Domoticz crash when 'Dummy' is enabled

voyo
Posts: 40
Joined: Monday 17 February 2020 19:16
Target OS: Raspberry Pi / ODroid
Domoticz version: beta
Location: Poland
Contact:

Domoticz crash when 'Dummy' is enabled

Post by voyo »

Hi,
I'm having an issue with Domoticz :(
It started last Sunday: Domoticz began crashing, which I only noticed after some time. After some failed attempts to fix it I gave up (it was late) and simply restored the database from the previous day. I thought it might have been caused by PluginManager, which I had enabled, and some new plugins I had added.
Anyway, I restored the backup and removed all the new stuff I had added. That helped, and I figured I would look into what was wrong later.
But today Domoticz is crashing again, and I haven't touched it since... Once again, the only thing that helps is restoring an old DB backup. But this is not an acceptable approach in the long run :/
TL;DR: the culprit is the 'Dummy' hardware device, and I need help troubleshooting it further :(

Context, environment and troubleshooting steps so far:
Domoticz x86, running in Docker on a fast (enough) server.
Status: Domoticz V2025.1 (build 16611) (c)2012-2025 GizMoCuz
2025-09-09 00:04:25.251 Status: Build Hash: 506720457, Date: 2025-04-03 16:12:32
Database: no visible issues. I checked it with integrity_check and recovery, and dumped everything to a fresh file. No joy.
I disabled the hardware entries one by one to find the problematic one (update hardware set Enabled=0 where id=xx;).
I removed all my plugins and scripts, physically, as files.
Even with everything removed, when only 'Dummy' is enabled, Domoticz still crashes very quickly.
I have plenty of devices based on 'Dummy'; disabling them one by one would make for a loooong night...
I really need good suggestions.
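Rather than disabling the Dummy-based devices one by one, a binary search over their IDs would find the culprit in about log2(N) restarts: disable half, restart, keep whichever half still crashes, and repeat. A minimal sketch of that idea follows; the `DeviceStatus` table and `Used` column are my assumption about the Domoticz SQLite schema, and the device IDs are made up, so verify the names against your own database before running any generated SQL.

```python
# Hypothetical bisection helper for hunting a crashing device.
# Table/column names (DeviceStatus, Used, ID) are assumptions about the
# Domoticz schema -- check them against your own database first.

def split_candidates(ids):
    """Split the remaining suspect device IDs into two halves."""
    mid = len(ids) // 2
    return ids[:mid], ids[mid:]

def disable_sql(ids):
    """SQL that marks the given devices as unused (reversible with Used=1)."""
    id_list = ",".join(str(i) for i in ids)
    return f"UPDATE DeviceStatus SET Used=0 WHERE ID IN ({id_list});"

if __name__ == "__main__":
    # Example: suppose these are the IDs of all Dummy-based devices.
    # After each restart, keep the half that still crashes and split again.
    suspects = [101, 102, 103, 104, 105, 106, 107, 108]
    first, second = split_candidates(suspects)
    print(disable_sql(first))  # disable first half, restart, observe
```

With ~100 devices this takes roughly seven restarts instead of a hundred.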
Docker is run with
EXTRA_CMD_ARG="-loglevel normal,status,error,debug -debuglevel normal,hardware,received,webserver,eventsystem,python,thread_id"
and here is the crash:

(I just noticed there are still some scripts enabled, probably autogenerated Blockly ones; I don't know how to disable them. Still, this is something I didn't touch.)


2025-09-09 00:11:17.565 Launch: Begin container self-repair
2025-09-09 00:11:18.110 Launch: End container self-repair
2025-09-09 00:11:18.168 Status: Domoticz V2025.1 (build 16611) (c)2012-2025 GizMoCuz
2025-09-09 00:11:18.168 Status: Build Hash: 506720457, Date: 2025-04-03 16:12:32
2025-09-09 00:11:18.168 Status: Startup Path: /opt/domoticz/
2025-09-09 00:11:18.188 Sunrise: 06:08:00 SunSet: 19:07:00
2025-09-09 00:11:18.188 Day length: 13:00:00 Sun at south: 12:38:00
2025-09-09 00:11:18.188 Civil twilight start: 05:35:00 Civil twilight end: 19:40:00
2025-09-09 00:11:18.188 Nautical twilight start: 04:56:00 Nautical twilight end: 20:20:00
2025-09-09 00:11:18.188 Astronomical twilight start: 04:13:00 Astronomical twilight end: 21:02:00
2025-09-09 00:11:18.226 Status: PluginSystem: Started, Python version '3.9.2', 0 plugin definitions loaded.
2025-09-09 00:11:18.230 Active notification Subsystems: email, telegram (2/13)
2025-09-09 00:11:18.232 Status: WebServer(HTTP) started on address: :: with port 8080
2025-09-09 00:11:18.239 Status: WebServer(SSL) started on address: :: with port 443
2025-09-09 00:11:18.241 Status: Camera: settings (re)loaded
2025-09-09 00:11:18.242 Starting shared server on: :::6144
2025-09-09 00:11:18.242 Status: TCPServer: shared server started...
2025-09-09 00:11:18.244 Status: RxQueue: queue worker started...
2025-09-09 00:11:19.281 Status: [web:8080] Incoming connection from: 10.0.20.6
2025-09-09 00:11:20.209 Status: [web:8080] Incoming connection from: 10.0.20.129
2025-09-09 00:11:20.245 Status: NotificationSystem: thread started...
2025-09-09 00:11:20.246 Status: EventSystem: reset all events...
2025-09-09 00:11:20.250 Status: dzVents: Write file: /opt/domoticz/userdata/scripts/dzVents/generated_scripts/CWU.lua
2025-09-09 00:11:20.250 Status: dzVents: Write file: /opt/domoticz/userdata/scripts/dzVents/generated_scripts/przyciski rolet.lua
2025-09-09 00:11:20.251 Status: dzVents: Write file: /opt/domoticz/userdata/scripts/dzVents/generated_scripts/deszczówka_average.lua
2025-09-09 00:11:20.251 Status: dzVents: Write file: /opt/domoticz/userdata/scripts/dzVents/generated_scripts/timeKeeper-LampaGarderoba.lua
2025-09-09 00:11:20.252 Status: dzVents: Write file: /opt/domoticz/userdata/scripts/dzVents/generated_scripts/timeKeeper-LampaPralnia.lua
2025-09-09 00:11:20.252 Status: dzVents: Write file: /opt/domoticz/userdata/scripts/dzVents/generated_scripts/timeKeeper-LampaPiwnica.lua
2025-09-09 00:11:20.253 Status: dzVents: Write file: /opt/domoticz/userdata/scripts/dzVents/generated_scripts/timeKeeper-LampaWiatrołap.lua
2025-09-09 00:11:20.254 Status: dzVents: Write file: /opt/domoticz/userdata/scripts/dzVents/generated_scripts/timeKeeper-LampaHol.lua
2025-09-09 00:11:20.254 Status: dzVents: Write file: /opt/domoticz/userdata/scripts/dzVents/generated_scripts/timeKeeper-LampaKorytarzDoPiwnicy.lua
2025-09-09 00:11:20.255 Status: dzVents: Write file: /opt/domoticz/userdata/scripts/dzVents/generated_scripts/energy cost.lua
2025-09-09 00:11:20.256 Status: dzVents: Write file: /opt/domoticz/userdata/scripts/dzVents/generated_scripts/isHoliday.lua
2025-09-09 00:11:20.256 Status: EventSystem: reset all device statuses...
terminate called after throwing an instance of 'std::invalid_argument'
what(): stoull
2025-09-09 00:11:20.286 Error: Domoticz(pid:1, tid:33('MainWorker')) received fatal signal 6 (Aborted)
2025-09-09 00:11:20.286 Error: siginfo address=0x1, address=0x7fc55ba0ed51
2025-09-09 00:11:20.310 Error: Failed to start gdb, will use backtrace() for printing stack frame

2025-09-09 00:11:20.318 Error: #0 /opt/domoticz/domoticz : + 0x418a43 [0x555cfd191a43]
2025-09-09 00:11:20.318 Error: #1 /opt/domoticz/domoticz : signal_handler(int, siginfo_t*, void*) + 0x29d [0x555cfd19252d]
2025-09-09 00:11:20.318 Error: #2 /lib/x86_64-linux-gnu/libpthread.so.0 : + 0x13140 [0x7fc55bd21140]
2025-09-09 00:11:20.318 Error: #3 /lib/x86_64-linux-gnu/libc.so.6 : gsignal + 0x141 [0x7fc55ba0ed51]
2025-09-09 00:11:20.318 Error: #4 /lib/x86_64-linux-gnu/libc.so.6 : abort + 0x123 [0x7fc55b9f8537]
2025-09-09 00:11:20.318 Error: #5 /opt/domoticz/domoticz : + 0x30a693 [0x555cfd083693]
2025-09-09 00:11:20.318 Error: #6 /opt/domoticz/domoticz : __cxxabiv1::__terminate(void (*)()) + 0x6 [0x555cfd912846]
2025-09-09 00:11:20.318 Error: #7 /opt/domoticz/domoticz : + 0xb998b1 [0x555cfd9128b1]
2025-09-09 00:11:20.318 Error: #8 /opt/domoticz/domoticz : + 0xb99a05 [0x555cfd912a05]
2025-09-09 00:11:20.318 Error: #9 /opt/domoticz/domoticz : std::__throw_invalid_argument(char const*) + 0x3d [0x555cfd085165]
2025-09-09 00:11:20.318 Error: #10 /opt/domoticz/domoticz : + 0x35a374 [0x555cfd0d3374]
2025-09-09 00:11:20.318 Error: #11 /opt/domoticz/domoticz : CEventSystem::GetCurrentStates() + 0xf00 [0x555cfd0f7ec0]
2025-09-09 00:11:20.318 Error: #12 /opt/domoticz/domoticz : CEventSystem::StartEventSystem() + 0x110 [0x555cfd0f8420]
2025-09-09 00:11:20.318 Error: #13 /opt/domoticz/domoticz : MainWorker::Do_Work() + 0x49c [0x555cfd15aa4c]
2025-09-09 00:11:20.318 Error: #14 /opt/domoticz/domoticz : + 0xc132f0 [0x555cfd98c2f0]
2025-09-09 00:11:20.319 Error: #15 /lib/x86_64-linux-gnu/libpthread.so.0 : + 0x7ea7 [0x7fc55bd15ea7]
2025-09-09 00:11:20.319 Error: #16 /lib/x86_64-linux-gnu/libc.so.6 : clone + 0x3f [0x7fc55bad1acf]
2025-09-09 00:11:20.319 Error: Domoticz(pid:1, tid:33('MainWorker')) received fatal signal 11 (Segmentation fault) while backtracing
2025-09-09 00:11:20.319 Error: siginfo address=(nil), address=0x7fc55b9f8602
2025-09-09 00:11:20.325 Error: #0 /opt/domoticz/domoticz : + 0x418a43 [0x555cfd191a43]
2025-09-09 00:11:20.325 Error: #1 /opt/domoticz/domoticz : signal_handler(int, siginfo_t*, void*) + 0x280 [0x555cfd192510]
2025-09-09 00:11:20.325 Error: #2 /lib/x86_64-linux-gnu/libpthread.so.0 : + 0x13140 [0x7fc55bd21140]
2025-09-09 00:11:20.325 Error: #3 /lib/x86_64-linux-gnu/libc.so.6 : abort + 0x1ee [0x7fc55b9f8602]
2025-09-09 00:11:20.326 Error: #4 /opt/domoticz/domoticz : + 0x30a693 [0x555cfd083693]
2025-09-09 00:11:20.326 Error: #5 /opt/domoticz/domoticz : __cxxabiv1::__terminate(void (*)()) + 0x6 [0x555cfd912846]
2025-09-09 00:11:20.326 Error: #6 /opt/domoticz/domoticz : + 0xb998b1 [0x555cfd9128b1]
2025-09-09 00:11:20.326 Error: #7 /opt/domoticz/domoticz : + 0xb99a05 [0x555cfd912a05]
2025-09-09 00:11:20.326 Error: #8 /opt/domoticz/domoticz : std::__throw_invalid_argument(char const*) + 0x3d [0x555cfd085165]
2025-09-09 00:11:20.326 Error: #9 /opt/domoticz/domoticz : + 0x35a374 [0x555cfd0d3374]
2025-09-09 00:11:20.326 Error: #10 /opt/domoticz/domoticz : CEventSystem::GetCurrentStates() + 0xf00 [0x555cfd0f7ec0]
2025-09-09 00:11:20.326 Error: #11 /opt/domoticz/domoticz : CEventSystem::StartEventSystem() + 0x110 [0x555cfd0f8420]
2025-09-09 00:11:20.326 Error: #12 /opt/domoticz/domoticz : MainWorker::Do_Work() + 0x49c [0x555cfd15aa4c]
2025-09-09 00:11:20.326 Error: #13 /opt/domoticz/domoticz : + 0xc132f0 [0x555cfd98c2f0]
2025-09-09 00:11:20.326 Error: #14 /lib/x86_64-linux-gnu/libpthread.so.0 : + 0x7ea7 [0x7fc55bd15ea7]
2025-09-09 00:11:20.326 Error: #15 /lib/x86_64-linux-gnu/libc.so.6 : clone + 0x3f [0x7fc55bad1acf]
2025-09-09 00:11:20.326 Error: Domoticz(pid:1, tid:33('MainWorker')) received fatal signal 11 (Segmentation fault)
2025-09-09 00:11:20.327 Error: siginfo address=(nil), address=0x7fc55b9f8602
2025-09-09 00:11:20.347 Error: Failed to start gdb, will use backtrace() for printing stack frame

2025-09-09 00:11:20.351 Error: #0 /opt/domoticz/domoticz : + 0x418a43 [0x555cfd191a43]
2025-09-09 00:11:20.351 Error: #1 /opt/domoticz/domoticz : signal_handler(int, siginfo_t*, void*) + 0x29d [0x555cfd19252d]
2025-09-09 00:11:20.351 Error: #2 /lib/x86_64-linux-gnu/libpthread.so.0 : + 0x13140 [0x7fc55bd21140]
2025-09-09 00:11:20.351 Error: #3 /lib/x86_64-linux-gnu/libc.so.6 : abort + 0x1ee [0x7fc55b9f8602]
2025-09-09 00:11:20.351 Error: #4 /opt/domoticz/domoticz : + 0x30a693 [0x555cfd083693]
2025-09-09 00:11:20.351 Error: #5 /opt/domoticz/domoticz : __cxxabiv1::__terminate(void (*)()) + 0x6 [0x555cfd912846]
2025-09-09 00:11:20.351 Error: #6 /opt/domoticz/domoticz : + 0xb998b1 [0x555cfd9128b1]
2025-09-09 00:11:20.351 Error: #7 /opt/domoticz/domoticz : + 0xb99a05 [0x555cfd912a05]
2025-09-09 00:11:20.351 Error: #8 /opt/domoticz/domoticz : std::__throw_invalid_argument(char const*) + 0x3d [0x555cfd085165]
2025-09-09 00:11:20.352 Error: #9 /opt/domoticz/domoticz : + 0x35a374 [0x555cfd0d3374]
2025-09-09 00:11:20.352 Error: #10 /opt/domoticz/domoticz : CEventSystem::GetCurrentStates() + 0xf00 [0x555cfd0f7ec0]
2025-09-09 00:11:20.352 Error: #11 /opt/domoticz/domoticz : CEventSystem::StartEventSystem() + 0x110 [0x555cfd0f8420]
2025-09-09 00:11:20.352 Error: #12 /opt/domoticz/domoticz : MainWorker::Do_Work() + 0x49c [0x555cfd15aa4c]
2025-09-09 00:11:20.352 Error: #13 /opt/domoticz/domoticz : + 0xc132f0 [0x555cfd98c2f0]
2025-09-09 00:11:20.352 Error: #14 /lib/x86_64-linux-gnu/libpthread.so.0 : + 0x7ea7 [0x7fc55bd15ea7]
2025-09-09 00:11:20.352 Error: #15 /lib/x86_64-linux-gnu/libc.so.6 : clone + 0x3f [0x7fc55bad1acf]

^C
voyo

Re: Domoticz crash when 'Dummy' is enabled

Post by voyo »

One more crash. I managed to disable all scripts (Blockly etc.) and the data push to InfluxDB,
and built Domoticz freshly from git.

$ git status
On branch development
Your branch is up to date with 'origin/development'.

$ git log
commit b67ec555ddb9fb968d2b4e41643301da4b47c4d8 (HEAD -> development, origin/development, origin/HEAD)
Author: GizMoCuz <[email protected]>
Date: Sun Aug 31 08:16:47 2025 +0200

Implemented: MQTT-AD, allow publishing messages

commit 3b2b01e39477e7bdd367a643bfa1c735635df6dc
Merge: 911a727e8 1fa1ef8ef
Author: KidDigital <[email protected]>
Date: Thu Aug 28 20:13:18 2025 +0200

Merge pull request #6379 from kiddigital/fix/compilewarningenphase

Fix Linux compile warnings EnphaseAPI



An interesting fact: when I start Domoticz with the Dummy hardware disabled and then enable it in the UI, it works (until a restart).
Devices based on Dummy also work and can be updated etc.
But when Domoticz is (re)started with Dummy already enabled, it crashes miserably... so whatever this is, it only has an effect during initialization.
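The backtrace points at std::stoull throwing std::invalid_argument inside CEventSystem::GetCurrentStates(), which runs at startup and reads device states back from the database; std::stoull throws exactly when the string has no parseable leading digits. So one way to hunt the culprit without restarting at all is to scan the database for stored values that stoull would reject. Below is a sketch of such a scan; the `DeviceStatus`/`sValue` names are my assumption about the Domoticz schema, and which column GetCurrentStates() actually converts is a guess, so treat any hits only as candidates to inspect.

```python
# Sketch: scan the Domoticz DB for values std::stoull would reject.
# std::invalid_argument is thrown when no digits can be parsed at all.
# Schema names (DeviceStatus, sValue) are assumptions -- verify first.
import sqlite3

def stoull_would_throw(s: str) -> bool:
    """Mimic std::stoull's failure mode: no leading digits -> throw."""
    s = s.lstrip()
    return not (s[:1].isdigit() or (s[:1] in "+-" and s[1:2].isdigit()))

def scan(db_path: str):
    """Return (ID, sValue) pairs containing a non-numeric field."""
    conn = sqlite3.connect(db_path)
    suspects = []
    for idx, svalue in conn.execute("SELECT ID, sValue FROM DeviceStatus"):
        # Domoticz packs multiple fields into sValue separated by ';'
        for part in str(svalue or "").split(";"):
            if part and stoull_would_throw(part):
                suspects.append((idx, svalue))
                break
    conn.close()
    return suspects

if __name__ == "__main__":
    for idx, svalue in scan("domoticz.db"):
        print(f"device {idx}: suspicious sValue {svalue!r}")
```

Note that stoull stops at the first non-digit, so a value like "0.00;11744.00" parses fine; only a field with no leading digits at all (empty, or text) would trigger the exception seen in the log.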




voyo@swarog:~/domoticz.git$ ./domoticz -loglevel normal,status,error,debug -debuglevel normal,hardware,received,webserver,eventsystem,python,thread_id
2025-09-09 02:14:09.806 [7f3ff0e21e00] Status: Domoticz V2025.1 (build 16776) (c)2012-2025 GizMoCuz
2025-09-09 02:14:09.806 [7f3ff0e21e00] Status: Build Hash: b67ec555d, Date: 2025-08-31 08:16:47
2025-09-09 02:14:09.806 [7f3ff0e21e00] Status: Startup Path: /home/voyo/domoticz.git/
2025-09-09 02:14:09.839 [7f3ff0e21e00] Sunrise: 06:08:00 SunSet: 19:07:00
2025-09-09 02:14:09.839 [7f3ff0e21e00] Day length: 13:00:00 Sun at south: 12:38:00
2025-09-09 02:14:09.839 [7f3ff0e21e00] Civil twilight start: 05:35:00 Civil twilight end: 19:40:00
2025-09-09 02:14:09.839 [7f3ff0e21e00] Nautical twilight start: 04:56:00 Nautical twilight end: 20:20:00
2025-09-09 02:14:09.839 [7f3ff0e21e00] Astronomical twilight start: 04:13:00 Astronomical twilight end: 21:02:00
2025-09-09 02:14:09.914 [7f3ff0e21e00] Status: PluginSystem: Started, Python version '3.11.2', 1 plugin definitions loaded.
2025-09-09 02:14:09.926 [7f3ff0e21e00] Active notification Subsystems: (0/13)
2025-09-09 02:14:09.927 [7f3ff0e21e00] Debug: CWebServer::StartServer() : settings : 'server_settings[is_secure_=false, www_root='/home/voyo/domoticz.git/www', listening_address='::', listening_port='8080', vhostname='', php_cgi_path='']'
2025-09-09 02:14:09.929 [7f3ff0e21e00] Status: WebServer(HTTP) started on address: :: with port 8080
2025-09-09 02:14:09.932 [7f3ff0e21e00] Debug: [web:8080] Adding IPv4 network (127.0.0.*) to list of trusted networks.
2025-09-09 02:14:09.933 [7f3ff0e21e00] Debug: [web:8080] Adding IPv4 network (10.0.20.*) to list of trusted networks.
2025-09-09 02:14:09.933 [7f3ff0e21e00] Debug: [web:8080] Adding IPv4 network (172.16.0.*) to list of trusted networks.
2025-09-09 02:14:09.933 [7f3ff0e21e00] Debug: [web:8080] Adding IPv4 network (172.17.0.*) to list of trusted networks.
2025-09-09 02:14:09.933 [7f3ff0e21e00] Debug: [web:8080] Adding IPv4 network (10.0.21.*) to list of trusted networks.
2025-09-09 02:14:09.934 [7f3ff0e21e00] Debug: WebServer(HTTP) started with 245 Registered Commands
2025-09-09 02:14:09.934 [7f3ff0e21e00] Debug: cWebEm Registration: 10 pages, 9 actions, 4 whitelist urls, 11 whitelist commands
2025-09-09 02:14:09.936 [7f3ff0e21e00] Debug: CWebServer::StartServer() : settings : ssl_server_settings['server_settings[is_secure_=true, www_root='/home/voyo/domoticz.git/www', listening_address='::', listening_port='443', vhostname='', php_cgi_path='']', ssl_method='tls', certificate_chain_file_path='./server_cert.pem', ca_cert_file_path='./server_cert.pem', cert_file_path=./server_cert.pem', private_key_file_path='./server_cert.pem', private_key_pass_phrase='', ssl_options='single_dh_use', tmp_dh_file_path='./server_cert.pem', verify_peer=false, verify_fail_if_no_peer_cert=false, verify_file_path='']
2025-09-09 02:14:09.940 [7f3fed5fc6c0] Debug: [web:8080] Host:10.0.20.129 Uri:/json.htm?type=command&param=udevice&idx=1804&nvalue=0&svalue=0.00;11744.00;0.00&rssi=5
2025-09-09 02:14:09.940 [7f3fed5fc6c0] Debug: [web:8080] Request Headers:
Host: domoticz.castle.conserit.pl:8080
User-Agent: ESP Easy/20669/Nov 30 2023 19:13:53
Connection: close
Accept-Encoding: identity;q=1,chunked;q=0.1,*;q=0

2025-09-09 02:14:09.941 [7f3fed5fc6c0] Debug: [web:8080] IP (10.0.20.129) is within Trusted network range!
2025-09-09 02:14:09.941 [7f3fed5fc6c0] Debug: CWebServer::GetJSonPage: udevice : /json.htm?type=command&param=udevice&idx=1804&nvalue=0&svalue=0.00;11744.00;0.00&rssi=5
2025-09-09 02:14:09.944 [7f3fed5fc6c0] Debug: SQLH UpdateValueInt Woda HwID:3 DevID:83804 Type:243 sType:28 nValue:0 sValue:0.00;11744.00;0.00 IDX: 1804
2025-09-09 02:14:09.945 [7f3fed5fc6c0] Status: [web:8080] Incoming connection from: 10.0.20.129
2025-09-09 02:14:09.946 [7f3fed5fc6c0] Debug: Web ACLF: 10.0.20.129 - - [09/Sep/2025:02:14:09.940 +0200] "GET /json.htm?type=command&param=udevice&idx=1804&nvalue=0&svalue=0.00;11744.00;0.00&rssi=5 HTTP/1.1" 200 49 - "ESP Easy/20669/Nov 30 2023 19:13:53"
2025-09-09 02:14:09.946 [7f3ff0e21e00] Debug: [web:443] Enabled ciphers (TLSv1.2) ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384
2025-09-09 02:14:09.952 [7f3ff0e21e00] Debug: [web:443] 'DH PARAMETERS' found in file ./server_cert.pem
2025-09-09 02:14:09.954 [7f3ff0e21e00] Status: WebServer(SSL) startup failed on address :: with port: 443: bind: Permission denied, trying ::
2025-09-09 02:14:09.955 [7f3ff0e21e00] Debug: [web:443] Enabled ciphers (TLSv1.2) ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384
2025-09-09 02:14:09.958 [7f3ff0e21e00] Debug: [web:443] 'DH PARAMETERS' found in file ./server_cert.pem
2025-09-09 02:14:09.958 [7f3ff0e21e00] Status: WebServer(SSL) startup failed on address :: with port: 443: bind: Permission denied, trying 0.0.0.0
2025-09-09 02:14:09.959 [7f3ff0e21e00] Debug: [web:443] Enabled ciphers (TLSv1.2) ECDHE-ECDSA-AES128-GCM-SHA256:ECDHE-RSA-AES128-GCM-SHA256:ECDHE-ECDSA-AES256-GCM-SHA384:ECDHE-RSA-AES256-GCM-SHA384:ECDHE-ECDSA-CHACHA20-POLY1305:ECDHE-RSA-CHACHA20-POLY1305:DHE-RSA-AES128-GCM-SHA256:DHE-RSA-AES256-GCM-SHA384
2025-09-09 02:14:09.962 [7f3ff0e21e00] Debug: [web:443] 'DH PARAMETERS' found in file ./server_cert.pem
2025-09-09 02:14:09.962 [7f3ff0e21e00] Error: WebServer(SSL) startup failed on address 0.0.0.0 with port: 443: bind: Permission denied
2025-09-09 02:14:09.962 [7f3ff0e21e00] Error: WebServer(SSL) check privileges for opening ports below 1024
2025-09-09 02:14:09.963 [7f3ff0e21e00] Status: Camera: settings (re)loaded
2025-09-09 02:14:09.964 [7f3ff0e21e00] Starting shared server on: :::6144
2025-09-09 02:14:09.965 [7f3fe7fff6c0] Status: TCPServer: shared server started...
2025-09-09 02:14:09.965 [7f3ff0e21e00] Status: mDNS: Service started
2025-09-09 02:14:09.967 [7f3fe67fc6c0] Status: RxQueue: queue worker started...
2025-09-09 02:14:09.967 [7f3fe77fe6c0] Debug: mDNS: Local IPv4 address: 0.0.0.0
2025-09-09 02:14:09.967 [7f3fe77fe6c0] Debug: mDNS: Local IPv6 address: ::
2025-09-09 02:14:09.968 [7f3fe77fe6c0] mDNS: Service: _http._tcp.local.:443 for Hostname: włosań (2 sockets)
2025-09-09 02:14:09.968 [7f3fe77fe6c0] Debug: mDNS: Sending mDNS announce
2025-09-09 02:14:10.121 [7f3fed5fc6c0] Debug: [web:8080] Host:10.0.20.69 Uri:/json
2025-09-09 02:14:10.121 [7f3fed5fc6c0] Debug: [web:8080] Request Headers:
Host: 10.0.20.6:8080
Connection: Upgrade
Pragma: no-cache
Cache-Control: no-cache
Upgrade: websocket
Origin: http://10.0.20.6:8080
Sec-WebSocket-Version: 13
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/140.0.0.0 Safari/537.36
Accept-Encoding: gzip, deflate
Accept-Language: en-US,en;q=0.9,pl;q=0.8
Cookie: grafana_session=5f3d4056db89daca1edbffcd2ebf97fe; grafana_session_expiry=1757377332
Sec-GPC: 1
Sec-WebSocket-Key: isjfV0cieKUC1MgBOecclQ==
Sec-WebSocket-Extensions: permessage-deflate; client_max_window_bits
Sec-WebSocket-Protocol: domoticz

2025-09-09 02:14:10.121 [7f3fed5fc6c0] Debug: [web:8080] IP (10.0.20.69) is within Trusted network range!
2025-09-09 02:14:10.122 [7f3fed5fc6c0] Debug: Web ACLF: 10.0.20.69 - - [09/Sep/2025:02:14:10.121 +0200] "GET /json HTTP/1.1" 101 0 - "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/140.0.0.0 Safari/537.36"
2025-09-09 02:14:10.136 [7f3fed5fc6c0] Debug: [web:8080] Host:10.0.20.69 Uri:/json
2025-09-09 02:14:10.136 [7f3fed5fc6c0] Debug: [web:8080] Request Headers:
Host: 10.0.20.6:8080
Connection: Upgrade
Pragma: no-cache
Cache-Control: no-cache
Upgrade: websocket
Origin: http://10.0.20.6:8080
Sec-WebSocket-Version: 13
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/140.0.0.0 Safari/537.36
Accept-Encoding: gzip, deflate
Accept-Language: en-US,en;q=0.9,pl;q=0.8
Cookie: grafana_session=5f3d4056db89daca1edbffcd2ebf97fe; grafana_session_expiry=1757377332
Sec-GPC: 1
Sec-WebSocket-Key: aC8Ka6SNfaEDGYaF6CZmuQ==
Sec-WebSocket-Extensions: permessage-deflate; client_max_window_bits
Sec-WebSocket-Protocol: domoticz

2025-09-09 02:14:10.136 [7f3fed5fc6c0] Debug: [web:8080] IP (10.0.20.69) is within Trusted network range!
2025-09-09 02:14:10.137 [7f3fed5fc6c0] Debug: Web ACLF: 10.0.20.69 - - [09/Sep/2025:02:14:10.136 +0200] "GET /json HTTP/1.1" 101 0 - "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/140.0.0.0 Safari/537.36"
2025-09-09 02:14:10.459 [7f3fed5fc6c0] Debug: [web:8080] Host:10.0.20.6 Uri:/
2025-09-09 02:14:10.459 [7f3fed5fc6c0] Debug: [web:8080] Request Headers:
Host: domoticz.castle.conserit.pl:8080
User-Agent: Monit/5.33.0
Accept: */*
Accept-Encoding: identity
Connection: close

2025-09-09 02:14:10.459 [7f3fed5fc6c0] Debug: [web:8080] IP (10.0.20.6) is within Trusted network range!
2025-09-09 02:14:10.460 [7f3fed5fc6c0] Debug: [web:/] modified to (/home/voyo/domoticz.git/www/index.html).
2025-09-09 02:14:10.463 [7f3fed5fc6c0] Status: [web:8080] Incoming connection from: 10.0.20.6
2025-09-09 02:14:10.463 [7f3fed5fc6c0] Debug: Web ACLF: 10.0.20.6 - - [09/Sep/2025:02:14:10.459 +0200] "GET / HTTP/1.1" 200 75754 - "Monit/5.33.0"
2025-09-09 02:14:11.968 [7f3fd7fff6c0] Status: NotificationSystem: thread started...
2025-09-09 02:14:11.969 [7f3fe6ffd6c0] Status: EventSystem: reset all events...
2025-09-09 02:14:11.970 [7f3fe6ffd6c0] Debug: EventSystem: Events (re)loaded
2025-09-09 02:14:11.970 [7f3fe6ffd6c0] Status: EventSystem: reset all device statuses...
terminate called after throwing an instance of 'std::invalid_argument'
what(): stoull
2025-09-09 02:14:12.011 [7f3fe6ffd6c0] Error: Domoticz(pid:1268, tid:1278('MainWorker')) received fatal signal 6 (Aborted)
2025-09-09 02:14:12.011 [7f3fe6ffd6c0] Error: siginfo address=0x3e8000004f4, address=0x7f3ff1aa9eec

2025-09-09 02:14:13.344 [7f3fe6ffd6c0] Error: Did not find stack frame for thread (LWP 1278)), printing full gdb output:

2025-09-09 02:14:13.344 [7f3fe6ffd6c0] Error: > [New LWP 1269]
2025-09-09 02:14:13.344 [7f3fe6ffd6c0] Error: > [New LWP 1270]
2025-09-09 02:14:13.344 [7f3fe6ffd6c0] Error: > [New LWP 1271]
2025-09-09 02:14:13.344 [7f3fe6ffd6c0] Error: > [New LWP 1272]
2025-09-09 02:14:13.344 [7f3fe6ffd6c0] Error: > [New LWP 1273]
2025-09-09 02:14:13.344 [7f3fe6ffd6c0] Error: > [New LWP 1274]
2025-09-09 02:14:13.344 [7f3fe6ffd6c0] Error: > [New LWP 1275]
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > [New LWP 1276]
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > [New LWP 1277]
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > [New LWP 1278]
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > [New LWP 1279]
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > [New LWP 1280]
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > [New LWP 1281]
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > [New LWP 1282]
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > [New LWP 1283]
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > [New LWP 1293]
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > [Thread debugging using libthread_db enabled]
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > Using host libthread_db library "/lib/x86_64-linux-gnu/libthread_db.so.1".
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > 0x00007f3ff1aee545 in __GI___clock_nanosleep (clock_id=clock_id@entry=0, flags=flags@entry=0, req=0x7ffdd03ccc50, rem=0x7ffdd03ccc50) at ../sysdeps/unix/sysv/linux/clock_nanosleep.c:48
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > 48 ../sysdeps/unix/sysv/linux/clock_nanosleep.c: No such file or directory.
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > Id Target Id Frame
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > * 1 Thread 0x7f3ff0e21e00 (LWP 1268) "domoticz" 0x00007f3ff1aee545 in __GI___clock_nanosleep (clock_id=clock_id@entry=0, flags=flags@entry=0, req=0x7ffdd03ccc50, rem=0x7ffdd03ccc50) at ../sysdeps/unix/sysv/linux/clock_nanosleep.c:48
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > 2 Thread 0x7f3ff05ff6c0 (LWP 1269) "SQLHelper" syscall () at ../sysdeps/unix/sysv/linux/x86_64/syscall.S:38
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > 3 Thread 0x7f3fefdfe6c0 (LWP 1270) "PluginMgr" syscall () at ../sysdeps/unix/sysv/linux/x86_64/syscall.S:38
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > 4 Thread 0x7f3feedff6c0 (LWP 1271) "InfluxPush" syscall () at ../sysdeps/unix/sysv/linux/x86_64/syscall.S:38
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > 5 Thread 0x7f3fee5fe6c0 (LWP 1272) "MQTTPush" syscall () at ../sysdeps/unix/sysv/linux/x86_64/syscall.S:38
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > 6 Thread 0x7f3feddfd6c0 (LWP 1273) "Webem_ssncleane" 0x00007f3ff1b27ee6 in epoll_wait (epfd=13, events=0x7f3feddfc4c0, maxevents=128, timeout=-1) at ../sysdeps/unix/sysv/linux/epoll_wait.c:30
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > 7 Thread 0x7f3fed5fc6c0 (LWP 1274) "WebServer_8080" 0x00007f3ff1b27ee6 in epoll_wait (epfd=9, events=0x7f3fed5fb500, maxevents=128, timeout=-1) at ../sysdeps/unix/sysv/linux/epoll_wait.c:30
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > 8 Thread 0x7f3fecdfb6c0 (LWP 1275) "Scheduler" syscall () at ../sysdeps/unix/sysv/linux/x86_64/syscall.S:38
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > 9 Thread 0x7f3fe7fff6c0 (LWP 1276) "TCPServer" 0x00007f3ff1b27ee6 in epoll_wait (epfd=16, events=0x7f3fe7ffe530, maxevents=128, timeout=-1) at ../sysdeps/unix/sysv/linux/epoll_wait.c:30
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > 10 Thread 0x7f3fe77fe6c0 (LWP 1277) "mDnsWorker" 0x00007f3ff1b1d99c in __GI___select (nfds=21, readfds=0x7f3fe77fd8f0, writefds=0x0, exceptfds=0x0, timeout=0x7f3fe77fd820) at ../sysdeps/unix/sysv/linux/select.c:69
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > 11 Thread 0x7f3fe6ffd6c0 (LWP 1278) "MainWorker" 0x00007f3ff1af2c17 in __GI___wait4 (pid=1284, stat_loc=0x7f3fe6ffb7e4, options=0, usage=0x0) at ../sysdeps/unix/sysv/linux/wait4.c:30
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > 12 Thread 0x7f3fe67fc6c0 (LWP 1279) "MainWorkerRxMsg" __futex_abstimed_wait_common64 (private=0, cancel=true, abstime=0x7f3fe67fbbb0, op=137, expected=0, futex_word=0x558ec758d3e8 <m_mainworker+6184>) at ./nptl/futex-internal.c:57
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > 13 Thread 0x7f3fe5ffb6c0 (LWP 1280) "Watchdog" 0x00007f3ff1aee545 in __GI___clock_nanosleep (clock_id=clock_id@entry=0, flags=flags@entry=0, req=0x7f3fe5ffac20, rem=0x7f3fe5ffac20) at ../sysdeps/unix/sysv/linux/clock_nanosleep.c:48
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > 14 Thread 0x7f3fe57fa6c0 (LWP 1281) "WebServer_8080" syscall () at ../sysdeps/unix/sysv/linux/x86_64/syscall.S:38
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > 15 Thread 0x7f3fe4ff96c0 (LWP 1282) "WebServer_8080" syscall () at ../sysdeps/unix/sysv/linux/x86_64/syscall.S:38
2025-09-09 02:14:13.345 [7f3fe6ffd6c0] Error: > 16 Thread 0x7f3fd7fff6c0 (LWP 1283) "NotificationSys" __futex_abstimed_wait_common64 (private=0, cancel=true, abstime=0x7f3fd7ffec00, op=137, expected=0, futex_word=0x558ec758cb3c <m_mainworker+3964>) at ./nptl/futex-internal.c:57
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > 17 Thread 0x7f3fd77fe6c0 (LWP 1293) "Plugin_ASIO" __futex_abstimed_wait_common64 (private=0, cancel=true, abstime=0x0, op=393, expected=0, futex_word=0x558ec8887438) at ./nptl/futex-internal.c:57
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: >
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > Thread 17 (Thread 0x7f3fd77fe6c0 (LWP 1293) "Plugin_ASIO"):
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > #0 __futex_abstimed_wait_common64 (private=0, cancel=true, abstime=0x0, op=393, expected=0, futex_word=0x558ec8887438) at ./nptl/futex-internal.c:57
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > #1 __futex_abstimed_wait_common (futex_word=futex_word@entry=0x558ec8887438, expected=expected@entry=0, clockid=clockid@entry=0, abstime=abstime@entry=0x0, private=private@entry=0, cancel=cancel@entry=true) at ./nptl/futex-internal.c:87
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > #2 0x00007f3ff1aa4f7b in __GI___futex_abstimed_wait_cancelable64 (futex_word=futex_word@entry=0x558ec8887438, expected=expected@entry=0, clockid=clockid@entry=0, abstime=abstime@entry=0x0, private=private@entry=0) at ./nptl/futex-internal.c:139
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > #3 0x00007f3ff1aa75d8 in __pthread_cond_wait_common (abstime=0x0, clockid=0, mutex=0x558ec88873d8, cond=0x558ec8887410) at ./nptl/pthread_cond_wait.c:503
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > #4 ___pthread_cond_wait (cond=0x558ec8887410, mutex=0x558ec88873d8) at ./nptl/pthread_cond_wait.c:618
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > #5 0x0000558ec7121c24 in boost::asio::detail::scheduler::run(boost::system::error_code&) [clone .isra.0] ()
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > #6 0x0000558ec71220d1 in Plugins::BoostWorkers() ()
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > #7 0x0000558ec7252a57 in thread_proxy ()
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > #8 0x00007f3ff1aa81f5 in start_thread (arg=<optimized out>) at ./nptl/pthread_create.c:442
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > #9 0x00007f3ff1b2889c in clone3 () at ../sysdeps/unix/sysv/linux/x86_64/clone3.S:81
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: >
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > Thread 16 (Thread 0x7f3fd7fff6c0 (LWP 1283) "NotificationSys"):
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > #0 __futex_abstimed_wait_common64 (private=0, cancel=true, abstime=0x7f3fd7ffec00, op=137, expected=0, futex_word=0x558ec758cb3c <m_mainworker+3964>) at ./nptl/futex-internal.c:57
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > #1 __futex_abstimed_wait_common (futex_word=futex_word@entry=0x558ec758cb3c <m_mainworker+3964>, expected=expected@entry=0, clockid=clockid@entry=1, abstime=abstime@entry=0x7f3fd7ffec00, private=private@entry=0, cancel=cancel@entry=true) at ./nptl/futex-internal.c:87
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > #2 0x00007f3ff1aa4f7b in __GI___futex_abstimed_wait_cancelable64 (futex_word=futex_word@entry=0x558ec758cb3c <m_mainworker+3964>, expected=expected@entry=0, clockid=clockid@entry=1, abstime=abstime@entry=0x7f3fd7ffec00, private=private@entry=0) at ./nptl/futex-internal.c:139
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > #3 0x00007f3ff1aa7baf in __pthread_cond_wait_common (abstime=<optimized out>, clockid=1, mutex=0x558ec758cae8 <m_mainworker+3880>, cond=0x558ec758cb10 <m_mainworker+3920>) at ./nptl/pthread_cond_wait.c:503
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > #4 ___pthread_cond_clockwait64 (abstime=<optimized out>, clockid=1, mutex=0x558ec758cae8 <m_mainworker+3880>, cond=0x558ec758cb10 <m_mainworker+3920>) at ./nptl/pthread_cond_wait.c:682
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > #5 ___pthread_cond_clockwait64 (cond=0x558ec758cb10 <m_mainworker+3920>, mutex=0x558ec758cae8 <m_mainworker+3880>, clockid=1, abstime=<optimized out>) at ./nptl/pthread_cond_wait.c:670
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > #6 0x0000558ec6c2a631 in CNotificationSystem::QueueThread() ()
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > #7 0x0000558ec7308373 in execute_native_thread_routine ()
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > #8 0x00007f3ff1aa81f5 in start_thread (arg=<optimized out>) at ./nptl/pthread_create.c:442
2025-09-09 02:14:13.346 [7f3fe6ffd6c0] Error: > #9 0x00007f3ff1b2889c in clone3 () at ../sysdeps/unix/sysv/linux/x86_64/clone3.S:81
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: >
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > Thread 15 (Thread 0x7f3fe4ff96c0 (LWP 1282) "WebServer_8080"):
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > #0 syscall () at ../sysdeps/unix/sysv/linux/x86_64/syscall.S:38
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > #1 0x0000558ec72d60f6 in std::__atomic_futex_unsigned_base::_M_futex_wait_until_steady(unsigned int*, unsigned int, bool, std::chrono::duration<long, std::ratio<1l, 1l> >, std::chrono::duration<long, std::ratio<1l, 1000000000l> >) ()
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > #2 0x0000558ec7201797 in http::server::CWebsocketHandler::Do_Work() ()
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > #3 0x0000558ec7308373 in execute_native_thread_routine ()
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > #4 0x00007f3ff1aa81f5 in start_thread (arg=<optimized out>) at ./nptl/pthread_create.c:442
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > #5 0x00007f3ff1b2889c in clone3 () at ../sysdeps/unix/sysv/linux/x86_64/clone3.S:81
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: >
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > Thread 14 (Thread 0x7f3fe57fa6c0 (LWP 1281) "WebServer_8080"):
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > #0 syscall () at ../sysdeps/unix/sysv/linux/x86_64/syscall.S:38
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > #1 0x0000558ec72d60f6 in std::__atomic_futex_unsigned_base::_M_futex_wait_until_steady(unsigned int*, unsigned int, bool, std::chrono::duration<long, std::ratio<1l, 1l> >, std::chrono::duration<long, std::ratio<1l, 1000000000l> >) ()
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > #2 0x0000558ec7201797 in http::server::CWebsocketHandler::Do_Work() ()
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > #3 0x0000558ec7308373 in execute_native_thread_routine ()
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > #4 0x00007f3ff1aa81f5 in start_thread (arg=<optimized out>) at ./nptl/pthread_create.c:442
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > #5 0x00007f3ff1b2889c in clone3 () at ../sysdeps/unix/sysv/linux/x86_64/clone3.S:81
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: >
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > Thread 13 (Thread 0x7f3fe5ffb6c0 (LWP 1280) "Watchdog"):
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > #0 0x00007f3ff1aee545 in __GI___clock_nanosleep (clock_id=clock_id@entry=0, flags=flags@entry=0, req=0x7f3fe5ffac20, rem=0x7f3fe5ffac20) at ../sysdeps/unix/sysv/linux/clock_nanosleep.c:48
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > #1 0x00007f3ff1af2e53 in __GI___nanosleep (req=<optimized out>, rem=<optimized out>) at ../sysdeps/unix/sysv/linux/nanosleep.c:25
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > #2 0x0000558ec6bc6fed in sleep_milliseconds(long) ()
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > #3 0x0000558ec6c47db6 in Do_Watchdog_Work() ()
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > #4 0x0000558ec7308373 in execute_native_thread_routine ()
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > #5 0x00007f3ff1aa81f5 in start_thread (arg=<optimized out>) at ./nptl/pthread_create.c:442
2025-09-09 02:14:13.347 [7f3fe6ffd6c0] Error: > #6 0x00007f3ff1b2889c in clone3 () at ../sysdeps/unix/sysv/linux/x86_64/clone3.S:81
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: >
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > Thread 12 (Thread 0x7f3fe67fc6c0 (LWP 1279) "MainWorkerRxMsg"):
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #0 __futex_abstimed_wait_common64 (private=0, cancel=true, abstime=0x7f3fe67fbbb0, op=137, expected=0, futex_word=0x558ec758d3e8 <m_mainworker+6184>) at ./nptl/futex-internal.c:57
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #1 __futex_abstimed_wait_common (futex_word=futex_word@entry=0x558ec758d3e8 <m_mainworker+6184>, expected=expected@entry=0, clockid=clockid@entry=1, abstime=abstime@entry=0x7f3fe67fbbb0, private=private@entry=0, cancel=cancel@entry=true) at ./nptl/futex-internal.c:87
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #2 0x00007f3ff1aa4f7b in __GI___futex_abstimed_wait_cancelable64 (futex_word=futex_word@entry=0x558ec758d3e8 <m_mainworker+6184>, expected=expected@entry=0, clockid=clockid@entry=1, abstime=abstime@entry=0x7f3fe67fbbb0, private=private@entry=0) at ./nptl/futex-internal.c:139
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #3 0x00007f3ff1aa7baf in __pthread_cond_wait_common (abstime=<optimized out>, clockid=1, mutex=0x558ec758d398 <m_mainworker+6104>, cond=0x558ec758d3c0 <m_mainworker+6144>) at ./nptl/pthread_cond_wait.c:503
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #4 ___pthread_cond_clockwait64 (abstime=<optimized out>, clockid=1, mutex=0x558ec758d398 <m_mainworker+6104>, cond=0x558ec758d3c0 <m_mainworker+6144>) at ./nptl/pthread_cond_wait.c:682
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #5 ___pthread_cond_clockwait64 (cond=0x558ec758d3c0 <m_mainworker+6144>, mutex=0x558ec758d398 <m_mainworker+6104>, clockid=1, abstime=<optimized out>) at ./nptl/pthread_cond_wait.c:670
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #6 0x0000558ec6c14257 in MainWorker::Do_Work_On_Rx_Messages() ()
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #7 0x0000558ec7308373 in execute_native_thread_routine ()
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #8 0x00007f3ff1aa81f5 in start_thread (arg=<optimized out>) at ./nptl/pthread_create.c:442
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #9 0x00007f3ff1b2889c in clone3 () at ../sysdeps/unix/sysv/linux/x86_64/clone3.S:81
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: >
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > Thread 11 (Thread 0x7f3fe6ffd6c0 (LWP 1278) "MainWorker"):
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #0 0x00007f3ff1af2c17 in __GI___wait4 (pid=1284, stat_loc=0x7f3fe6ffb7e4, options=0, usage=0x0) at ../sysdeps/unix/sysv/linux/wait4.c:30
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #1 0x0000558ec6c47605 in dumpstack_gdb(bool) ()
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #2 0x0000558ec6c47c15 in signal_handler(int, siginfo_t*, void*) ()
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #3 <signal handler called>
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #4 __pthread_kill_implementation (threadid=<optimized out>, signo=signo@entry=6, no_tid=no_tid@entry=0) at ./nptl/pthread_kill.c:44
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #5 0x00007f3ff1aa9f4f in __pthread_kill_internal (signo=6, threadid=<optimized out>) at ./nptl/pthread_kill.c:78
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #6 0x00007f3ff1a5afb2 in __GI_raise (sig=sig@entry=6) at ../sysdeps/posix/raise.c:26
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #7 0x00007f3ff1a45472 in __GI_abort () at ./stdlib/abort.c:79
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #8 0x0000558ec6b468f2 in __gnu_cxx::__verbose_terminate_handler() [clone .cold] ()
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #9 0x0000558ec72874ca in __cxxabiv1::__terminate(void (*)()) ()
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #10 0x0000558ec7287535 in std::terminate() ()
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #11 0x0000558ec7287688 in __cxa_throw ()
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #12 0x0000558ec6b48399 in std::__throw_invalid_argument(char const*) ()
2025-09-09 02:14:13.348 [7f3fe6ffd6c0] Error: > #13 0x0000558ec6b8e294 in unsigned long long __gnu_cxx::__stoa<unsigned long long, unsigned long long, char, int>(unsigned long long (*)(char const*, char**, int), char const*, char const*, unsigned long*, int) [clone .constprop.0] ()
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > #14 0x0000558ec6bb1af0 in CEventSystem::GetCurrentStates() ()
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > #15 0x0000558ec6bb20c0 in CEventSystem::StartEventSystem() ()
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > #16 0x0000558ec6c127ad in MainWorker::Do_Work() ()
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > #17 0x0000558ec7308373 in execute_native_thread_routine ()
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > #18 0x00007f3ff1aa81f5 in start_thread (arg=<optimized out>) at ./nptl/pthread_create.c:442
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > #19 0x00007f3ff1b2889c in clone3 () at ../sysdeps/unix/sysv/linux/x86_64/clone3.S:81
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: >
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > Thread 10 (Thread 0x7f3fe77fe6c0 (LWP 1277) "mDnsWorker"):
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > #0 0x00007f3ff1b1d99c in __GI___select (nfds=21, readfds=0x7f3fe77fd8f0, writefds=0x0, exceptfds=0x0, timeout=0x7f3fe77fd820) at ../sysdeps/unix/sysv/linux/select.c:69
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > #1 0x0000558ec6d60d14 in domoticz_mdns::mDNS::mDnsMainLoop() ()
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > #2 0x0000558ec7308373 in execute_native_thread_routine ()
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > #3 0x00007f3ff1aa81f5 in start_thread (arg=<optimized out>) at ./nptl/pthread_create.c:442
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > #4 0x00007f3ff1b2889c in clone3 () at ../sysdeps/unix/sysv/linux/x86_64/clone3.S:81
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: >
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > Thread 9 (Thread 0x7f3fe7fff6c0 (LWP 1276) "TCPServer"):
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > #0 0x00007f3ff1b27ee6 in epoll_wait (epfd=16, events=0x7f3fe7ffe530, maxevents=128, timeout=-1) at ../sysdeps/unix/sysv/linux/epoll_wait.c:30
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > #1 0x0000558ec6dd7016 in boost::asio::detail::epoll_reactor::run(long, boost::asio::detail::op_queue<boost::asio::detail::scheduler_operation>&) ()
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > #2 0x0000558ec71a3cae in boost::asio::detail::scheduler::run(boost::system::error_code&) [clone .isra.0] ()
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > #3 0x0000558ec71a42f0 in std::thread::_State_impl<std::thread::_Invoker<std::tuple<tcp::server::CTCPServer::StartServer(std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)::{lambda()#1}> > >::_M_run() ()
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > #4 0x0000558ec7308373 in execute_native_thread_routine ()
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > #5 0x00007f3ff1aa81f5 in start_thread (arg=<optimized out>) at ./nptl/pthread_create.c:442
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > #6 0x00007f3ff1b2889c in clone3 () at ../sysdeps/unix/sysv/linux/x86_64/clone3.S:81
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: >
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > Thread 8 (Thread 0x7f3fecdfb6c0 (LWP 1275) "Scheduler"):
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > #0 syscall () at ../sysdeps/unix/sysv/linux/x86_64/syscall.S:38
2025-09-09 02:14:13.349 [7f3fe6ffd6c0] Error: > #1 0x0000558ec72d60f6 in std::__atomic_futex_unsigned_base::_M_futex_wait_until_steady(unsigned int*, unsigned int, bool, std::chrono::duration<long, std::ratio<1l, 1l> >, std::chrono::duration<long, std::ratio<1l, 1000000000l> >) ()
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #2 0x0000558ec6c3c5cf in CScheduler::Do_Work() ()
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #3 0x0000558ec7308373 in execute_native_thread_routine ()
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #4 0x00007f3ff1aa81f5 in start_thread (arg=<optimized out>) at ./nptl/pthread_create.c:442
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #5 0x00007f3ff1b2889c in clone3 () at ../sysdeps/unix/sysv/linux/x86_64/clone3.S:81
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: >
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > Thread 7 (Thread 0x7f3fed5fc6c0 (LWP 1274) "WebServer_8080"):
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #0 0x00007f3ff1b27ee6 in epoll_wait (epfd=9, events=0x7f3fed5fb500, maxevents=128, timeout=-1) at ../sysdeps/unix/sysv/linux/epoll_wait.c:30
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #1 0x0000558ec6dd7016 in boost::asio::detail::epoll_reactor::run(long, boost::asio::detail::op_queue<boost::asio::detail::scheduler_operation>&) ()
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #2 0x0000558ec71f30ce in boost::asio::detail::scheduler::run(boost::system::error_code&) [clone .isra.0] ()
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #3 0x0000558ec71f86ce in http::server::server_base::run() ()
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #4 0x0000558ec6ca1268 in std::thread::_State_impl<std::thread::_Invoker<std::tuple<http::server::CWebServer::StartServer(http::server::server_settings&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&, bool)::{lambda()#265}> > >::_M_run() ()
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #5 0x0000558ec7308373 in execute_native_thread_routine ()
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #6 0x00007f3ff1aa81f5 in start_thread (arg=<optimized out>) at ./nptl/pthread_create.c:442
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #7 0x00007f3ff1b2889c in clone3 () at ../sysdeps/unix/sysv/linux/x86_64/clone3.S:81
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: >
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > Thread 6 (Thread 0x7f3feddfd6c0 (LWP 1273) "Webem_ssncleane"):
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #0 0x00007f3ff1b27ee6 in epoll_wait (epfd=13, events=0x7f3feddfc4c0, maxevents=128, timeout=-1) at ../sysdeps/unix/sysv/linux/epoll_wait.c:30
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #1 0x0000558ec6dd7016 in boost::asio::detail::epoll_reactor::run(long, boost::asio::detail::op_queue<boost::asio::detail::scheduler_operation>&) ()
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #2 0x0000558ec71c6a21 in boost::asio::detail::scheduler::run(boost::system::error_code&) [clone .isra.0] ()
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #3 0x0000558ec71c7252 in std::thread::_State_impl<std::thread::_Invoker<std::tuple<http::server::cWebem::cWebem(http::server::server_settings const&, std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> > const&)::{lambda()#2}> > >::_M_run() ()
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #4 0x0000558ec7308373 in execute_native_thread_routine ()
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #5 0x00007f3ff1aa81f5 in start_thread (arg=<optimized out>) at ./nptl/pthread_create.c:442
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #6 0x00007f3ff1b2889c in clone3 () at ../sysdeps/unix/sysv/linux/x86_64/clone3.S:81
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: >
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > Thread 5 (Thread 0x7f3fee5fe6c0 (LWP 1272) "MQTTPush"):
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #0 syscall () at ../sysdeps/unix/sysv/linux/x86_64/syscall.S:38
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #1 0x0000558ec72d60f6 in std::__atomic_futex_unsigned_base::_M_futex_wait_until_steady(unsigned int*, unsigned int, bool, std::chrono::duration<long, std::ratio<1l, 1l> >, std::chrono::duration<long, std::ratio<1l, 1000000000l> >) ()
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #2 0x0000558ec6d83226 in CMQTTPush::Do_Work() ()
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #3 0x0000558ec7308373 in execute_native_thread_routine ()
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #4 0x00007f3ff1aa81f5 in start_thread (arg=<optimized out>) at ./nptl/pthread_create.c:442
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #5 0x00007f3ff1b2889c in clone3 () at ../sysdeps/unix/sysv/linux/x86_64/clone3.S:81
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: >
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > Thread 4 (Thread 0x7f3feedff6c0 (LWP 1271) "InfluxPush"):
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #0 syscall () at ../sysdeps/unix/sysv/linux/x86_64/syscall.S:38
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #1 0x0000558ec72d60f6 in std::__atomic_futex_unsigned_base::_M_futex_wait_until_steady(unsigned int*, unsigned int, bool, std::chrono::duration<long, std::ratio<1l, 1l> >, std::chrono::duration<long, std::ratio<1l, 1000000000l> >) ()
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #2 0x0000558ec6d7e37e in CInfluxPush::Do_Work() ()
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #3 0x0000558ec7308373 in execute_native_thread_routine ()
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #4 0x00007f3ff1aa81f5 in start_thread (arg=<optimized out>) at ./nptl/pthread_create.c:442
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: > #5 0x00007f3ff1b2889c in clone3 () at ../sysdeps/unix/sysv/linux/x86_64/clone3.S:81
2025-09-09 02:14:13.350 [7f3fe6ffd6c0] Error: >
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > Thread 3 (Thread 0x7f3fefdfe6c0 (LWP 1270) "PluginMgr"):
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > #0 syscall () at ../sysdeps/unix/sysv/linux/x86_64/syscall.S:38
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > #1 0x0000558ec72d60f6 in std::__atomic_futex_unsigned_base::_M_futex_wait_until_steady(unsigned int*, unsigned int, bool, std::chrono::duration<long, std::ratio<1l, 1l> >, std::chrono::duration<long, std::ratio<1l, 1000000000l> >) ()
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > #2 0x0000558ec71226f6 in Plugins::CPluginSystem::Do_Work() ()
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > #3 0x0000558ec7308373 in execute_native_thread_routine ()
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > #4 0x00007f3ff1aa81f5 in start_thread (arg=<optimized out>) at ./nptl/pthread_create.c:442
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > #5 0x00007f3ff1b2889c in clone3 () at ../sysdeps/unix/sysv/linux/x86_64/clone3.S:81
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: >
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > Thread 2 (Thread 0x7f3ff05ff6c0 (LWP 1269) "SQLHelper"):
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > #0 syscall () at ../sysdeps/unix/sysv/linux/x86_64/syscall.S:38
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > #1 0x0000558ec72d60f6 in std::__atomic_futex_unsigned_base::_M_futex_wait_until_steady(unsigned int*, unsigned int, bool, std::chrono::duration<long, std::ratio<1l, 1l> >, std::chrono::duration<long, std::ratio<1l, 1000000000l> >) ()
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > #2 0x0000558ec6c6576c in CSQLHelper::Do_Work() ()
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > #3 0x0000558ec7308373 in execute_native_thread_routine ()
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > #4 0x00007f3ff1aa81f5 in start_thread (arg=<optimized out>) at ./nptl/pthread_create.c:442
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > #5 0x00007f3ff1b2889c in clone3 () at ../sysdeps/unix/sysv/linux/x86_64/clone3.S:81
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: >
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > Thread 1 (Thread 0x7f3ff0e21e00 (LWP 1268) "domoticz"):
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > #0 0x00007f3ff1aee545 in __GI___clock_nanosleep (clock_id=clock_id@entry=0, flags=flags@entry=0, req=0x7ffdd03ccc50, rem=0x7ffdd03ccc50) at ../sysdeps/unix/sysv/linux/clock_nanosleep.c:48
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > #1 0x00007f3ff1af2e53 in __GI___nanosleep (req=<optimized out>, rem=<optimized out>) at ../sysdeps/unix/sysv/linux/nanosleep.c:25
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > #2 0x0000558ec6bc6f75 in sleep_seconds(long) ()
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > #3 0x0000558ec6b49510 in main ()
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: >
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > Main thread:
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > #0 0x00007f3ff1aee545 in __GI___clock_nanosleep (clock_id=clock_id@entry=0, flags=flags@entry=0, req=0x7ffdd03ccc50, rem=0x7ffdd03ccc50) at ../sysdeps/unix/sysv/linux/clock_nanosleep.c:48
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > #1 0x00007f3ff1af2e53 in __GI___nanosleep (req=<optimized out>, rem=<optimized out>) at ../sysdeps/unix/sysv/linux/nanosleep.c:25
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > #2 0x0000558ec6bc6f75 in sleep_seconds(long) ()
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > #3 0x0000558ec6b49510 in main ()
2025-09-09 02:14:13.351 [7f3fe6ffd6c0] Error: > [Inferior 1 (process 1268) detached]
Aborted
voyo@swarog:~/domoticz.git$
waltervl
Posts: 6689
Joined: Monday 28 January 2019 18:48
Target OS: Linux
Domoticz version: 2025.1
Location: NL
Contact:

Re: Domoticz crash when 'Dummy' is enabled

Post by waltervl »

Did you check the troubleshooting Wiki? Check the database for inconsistencies? https://wiki.domoticz.com/Troubleshooting
Domoticz running on Udoo X86 (on Ubuntu)
Devices/plugins: ZigbeeforDomoticz (with Xiaomi, Ikea, Tuya devices), Nefit Easy, Midea Airco, Omnik Solar, Goodwe Solar

Re: Domoticz crash when 'Dummy' is enabled

Post by voyo »

waltervl wrote:Did you check the troubleshooting Wiki? Check the database for inconsistencies? https://wiki.domoticz.com/Troubleshooting
Yes, I explained all my troubleshooting in the first post. I followed the official Domoticz troubleshooting wiki.

Sent from my SM-S911B using Tapatalk


Re: Domoticz crash when 'Dummy' is enabled

Post by voyo »

Hey everyone,

Just wanted to post an update on the Domoticz crash I was dealing with and ask the devs to fix this properly.

What was wrong: My Domoticz (V2025.1, build 16611; also tested with 16776 from git development) kept crashing on startup when Dummy hardware was enabled. The logs showed a std::invalid_argument from stoull in CEventSystem::GetCurrentStates(): basically, it was choking while loading device states.

After hours of digging, I found the problem: a Dummy device (ID 1922, "Energy cost", Type 243, SubType 28 - Counter Incremental) had an sValue of "Inf". That’s not a number, and Domoticz couldn’t handle it since it expects a numeric value (like "123" or "123;456") for this device type.

SELECT * FROM DeviceStatus WHERE ID = 1922;
Output:
ID|HardwareID|DeviceID|Unit|Name|Used|Type|SubType|SwitchType|Favorite|SignalLevel|BatteryLevel|nValue|sValue|LastUpdate|Order|AddjValue|AddjMulti|AddjValue2|AddjMulti2|StrParam1|StrParam2|LastLevel|Protected|CustomImage|Description|Options|Color|OrgHardwareID
1922|3|83922|1|Energy cost|1|243|28|3|1|12|255|0|Inf|2025-09-08 22:55:00|1922|0.0|1.0|0.0|1.0|||0|0|0||ValueQuantity:UExO;ValueUnits:UExO||0

The simple fix was (though it took me hours of investigation...):
UPDATE DeviceStatus SET sValue = '0' WHERE ID = 1922;

This stopped the crashes, and Domoticz now starts fine with Dummy enabled.
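To check whether other devices carry similar junk, a scan like this sketch could help (Python with the stdlib sqlite3 module; it assumes the DeviceStatus schema shown above, and note that some device types legitimately store text in sValue, so treat hits as candidates for manual review, not automatic fixes):

```python
import sqlite3

def is_numeric_field(field: str) -> bool:
    """True if a single sValue field parses as a finite number (rejects Inf/NaN)."""
    try:
        value = float(field)
        return value == value and value not in (float("inf"), float("-inf"))
    except ValueError:
        return False

def find_suspect_svalues(db_path: str):
    """Return (ID, Name, sValue) rows whose sValue contains a non-numeric field.

    sValue may hold several ';'-separated fields, e.g. "123;456".
    """
    con = sqlite3.connect(db_path)
    suspects = []
    for dev_id, name, svalue in con.execute(
            "SELECT ID, Name, sValue FROM DeviceStatus"):
        fields = (svalue or "").split(";")
        if any(f and not is_numeric_field(f) for f in fields):
            suspects.append((dev_id, name, svalue))
    con.close()
    return suspects
```

Running this against a copy of domoticz.db would have flagged device 1922 with its "Inf" value straight away.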

I’m pretty sure a dzVents script (energy cost.lua) wrote this bad "Inf" value, maybe from something like math.huge, but I’m not sure.
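For illustration, here is a hypothetical reconstruction of the failure mode in Python (this is not the actual dzVents script, and parses_as_counter is just my stand-in name for Domoticz's strict stoull parse): a cost calculation that hits a zero counter delta produces infinity, which stringifies to a value no integer parser will accept.

```python
energy_used = 0.0                      # the counter delta this cycle was zero
cost_per_unit = 1.23

# A math.huge-style "guard" against division by zero produces infinity:
cost = cost_per_unit / energy_used if energy_used else float("inf")
svalue = str(cost)                     # 'inf': what ends up in the sValue column

def parses_as_counter(s: str) -> bool:
    """Stand-in for the strict stoull() parse Domoticz runs at startup."""
    try:
        int(s)
        return True
    except ValueError:
        return False

print(svalue, parses_as_counter(svalue))
```

In C++, the equivalent std::stoull("Inf") throws std::invalid_argument, which matches the backtrace above.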

Please fix this. Domoticz needs to be tougher about bad data. Two things:

1. Check inputs: don't let scripts, plugins, or external devices (e.g. via the API) write junk values to sValue that don't match the device type. A quick check before saving to the database would catch this.
2. Don't crash: when loading device states, Domoticz should skip or log bad data instead of crashing hard on a stoull error. This would keep things running even if a script messes up.
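The second point could look something like this sketch (a Python stand-in for the C++ parse path; the function name and behaviour are my suggestion, not existing Domoticz code): parse defensively, log, and fall back to a default instead of letting the exception propagate.

```python
def safe_counter_value(svalue: str, default: int = 0) -> int:
    """Parse a counter sValue field defensively: fall back instead of throwing."""
    try:
        return int(float(svalue))        # accept "123" as well as "123.0"
    except (ValueError, OverflowError):  # garbage -> ValueError, Inf -> OverflowError
        print(f"Warning: skipping bad sValue {svalue!r}, using {default}")
        return default
```

With something like this in GetCurrentStates(), a single corrupt row would produce a log warning instead of taking down the whole event system at startup.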

Big thanks to the Domoticz team for all the work, but I don't have time to dig into the C++ code and make a pull request myself. The logs and SQL I shared should be enough to track this down. Can you please make Domoticz more robust so we don't get burned by bad database values?
Calling @gizmocuz here.

Cheers, Voyo
gizmocuz
Posts: 2712
Joined: Thursday 11 July 2013 18:59
Target OS: Raspberry Pi / ODroid
Domoticz version: beta
Location: Top of the world
Contact:

Re: Domoticz crash when 'Dummy' is enabled

Post by gizmocuz »

Thanks for reporting!

In beta version 16781 this exception is now handled.

If people/scripts store invalid data, anything can happen, and not only in Domoticz: try messing with the Registry on Windows, or config files in /etc on Unix.
Quality outlives Quantity!