I'm suddenly getting the following error on a script: 'dzVents: Error (2.4.7): Error parsing json to LUA table: /home/pi/domoticz/scripts/dzVents/../lua/JSON.lua:1009: Lua script execution exceeds maximum number of lines'
return {
    on = {
        timer = { 'every 5 minutes' },
        httpResponses = { 'blitz' } -- matches the callback string below
    },
    data = {
        last = { initial = 0 }
    },
    execute = function(domoticz, triggerItem)
        local lightning = domoticz.devices('Bliksemteller')
        local preLight = tonumber(lightning.rawData[1])

        -- Haversine distance between two lat/lng pairs, in kilometres
        local distance = function(lat1, lng1, lat2, lng2)
            local radius = 6371 -- mean Earth radius in km
            local dLat = (lat2 - lat1) * math.pi / 180
            local dLng = (lng2 - lng1) * math.pi / 180
            lat1 = lat1 * math.pi / 180
            lat2 = lat2 * math.pi / 180
            local val = math.sin(dLat / 2) * math.sin(dLat / 2)
                + math.sin(dLng / 2) * math.sin(dLng / 2) * math.cos(lat1) * math.cos(lat2)
            local ang = 2 * math.atan2(math.sqrt(val), math.sqrt(1 - val))
            return radius * ang
        end

        local latHome = 51.860069 -- replace with your own coordinates
        local lngHome = 4.4122027 -- replace with your own coordinates
        local distanceRange = 10  -- maximum distance for filtering, in km
        local last = tonumber(domoticz.data.last)

        if (triggerItem.isTimer) then
            domoticz.openURL({
                url = 'https://www.onweeractueel.nl/domoticz_bo.json',
                method = 'GET',
                callback = 'blitz'
            })
        elseif (triggerItem.isHTTPResponse) then
            local response = triggerItem
            if (response.ok and response.isJSON) then
                local value = 0
                -- numeric for loop instead of repeat/until: the latter runs its
                -- body once even on an empty result and then indexes nil
                for tc = 1, #response.json do
                    local times = tonumber(response.json[tc][1])
                    local lat = tonumber(response.json[tc][2])
                    local lng = tonumber(response.json[tc][3])
                    if (distance(latHome, lngHome, lat, lng) <= distanceRange) then
                        if (times > last) then
                            value = value + 1
                        else
                            value = 0
                        end
                        domoticz.data.last = times
                    end
                end
                print('Blitz Value = ' .. value)
                if (value ~= preLight) then
                    lightning.updateCustomSensor(value)
                end
            else
                print('**blitz failed to fetch info')
            end
        end
    end
}
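As a quick sanity check of the haversine math the script relies on, the `distance` helper can be exercised in plain Lua outside Domoticz. This is a minimal sketch, assuming nothing beyond the standard `math` library; the `atan2` fallback covers both Lua 5.1/5.2 (`math.atan2`) and 5.3+ (two-argument `math.atan`), and the coordinates are arbitrary sample values:

```lua
-- Standalone check of the haversine helper; pure Lua, no Domoticz required.
local atan2 = math.atan2 or math.atan -- two-arg math.atan on Lua 5.3+

local function distance(lat1, lng1, lat2, lng2)
    local radius = 6371 -- mean Earth radius in km
    local dLat = (lat2 - lat1) * math.pi / 180
    local dLng = (lng2 - lng1) * math.pi / 180
    lat1 = lat1 * math.pi / 180
    lat2 = lat2 * math.pi / 180
    local val = math.sin(dLat / 2) ^ 2
        + math.sin(dLng / 2) ^ 2 * math.cos(lat1) * math.cos(lat2)
    return radius * 2 * atan2(math.sqrt(val), math.sqrt(1 - val))
end

print(distance(51.860069, 4.4122027, 51.860069, 4.4122027)) -- same point: 0
print(distance(51.0, 4.0, 52.0, 4.0)) -- one degree of latitude, roughly 111 km
```

A strike on the exact home coordinates should yield 0 km, and one degree of latitude should come out near 111 km, which is a cheap way to confirm the degree-to-radian conversions are right.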
I have looked into this a lot and never found a solution or workaround; I never realised it was a limitation in Domoticz.
It does make sense, in that you would not want the event system to hang.
In the case of the script the OP mentions, the JSON result is often over 20,000 lines long, and I suspect it simply takes too long to process it all.
I have contacted the owner/admin of onweeractueel to ask whether the results can be filtered (by setting max/min lon and lat), but never got an answer.
I am also investigating alternative providers, but there are not many, and the existing ones are rather expensive.
elmortero wrote: Sunday 29 July 2018 12:39
I have been looking into this a lot and never found a solution/workaround. [...]
It does indeed seem to be a hard-coded limit in json_reader.cpp (stackLimit_g), there to prevent a malformed JSON from causing a segmentation fault.
If onweeractueel extended their API with an option to filter the result by location and/or time, users would have more control over the size of the response.
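Until such server-side filtering exists, a client-side bounding-box pre-filter can at least shrink the table the rest of the script has to walk. A hedged sketch follows: `filterByBox` is a hypothetical helper (not part of dzVents), it assumes rows shaped `[time, lat, lng]` exactly as in the script above, and it does not raise the parse limit itself, since it can only run after the JSON has already been decoded:

```lua
-- Hypothetical bounding-box filter over already-decoded rows [time, lat, lng].
-- It trims the work done per event cycle; it does NOT change any parser limit.
local function filterByBox(rows, minLat, maxLat, minLng, maxLng)
    local kept = {}
    for _, row in ipairs(rows) do
        local lat, lng = tonumber(row[2]), tonumber(row[3])
        if lat and lng
            and lat >= minLat and lat <= maxLat
            and lng >= minLng and lng <= maxLng then
            kept[#kept + 1] = row
        end
    end
    return kept
end

-- Sample data: only the first row falls inside the 51..52 / 4..5 box.
local sample = {
    { 1532860000, 51.9, 4.4 },
    { 1532860001, 48.1, 11.6 },
}
local nearby = filterByBox(sample, 51, 52, 4, 5)
print(#nearby) -- 1
```

A rough lat/lng window is much cheaper than the haversine call, so running it first and only computing exact distances on the survivors keeps the per-cycle work proportional to nearby strikes rather than the full feed.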
waaren wrote: Sunday 29 July 2018 12:09
@EdwinK, this is a known problem and reported here
Yes, but I never got a response.
Greetz,
Oli
@EdwinK, @elmortero, today the question was bumped and we received a reply!
"At the moment we are rebuilding the script: an API will do the job, so that you can receive the info for a unique location instead of all the unrelated info."