Strange behaviour on Truenas Scale Container instance [Solved]

iguaan
Posts: 18
Joined: Thursday 21 May 2020 7:40
Target OS: NAS (Synology & others)
Domoticz version:
Contact:

Strange behaviour on Truenas Scale Container instance

Post by iguaan »

I decided to move from a Raspberry Pi 3B+ to a TrueNAS Scale Docker container.
When Domoticz runs in the container, data from the MySensors WiFi gateway ("nrf gw" in the log) doesn't always get through, and commands are delayed by about 20 seconds.
The same delay applies to Sonoff WiFi devices.
When I boot the Pi in parallel on a different IP, the container receives all the values and commands are triggered instantly.
Running only on the Pi shows no issues.
There isn't an official TrueNAS app, so I used the "Custom App" function.
The installation is triggered automatically; I added a few configuration settings for the repository and port forwarding:

Image Configuration
  Repository: domoticz/domoticz
  Tag: stable
  Pull Policy: Always pull an image even if it is present on the host.

Port Bind Mode: Publish port on the host for external access
  Host Port: 8081
  Container Port: 8080
  Protocol: TCP
  Host IPs:
Log from startup in Docker, with the Pi offline:


2025-08-08 09:55:31.754  Launch: Begin container self-repair
2025-08-08 09:55:31.933  Launch: End container self-repair
2025-08-08 09:55:31.933  Launch: Running customstart.sh
2025-08-08 09:55:31.957  Status: Domoticz V2025.1 (build 16672) (c)2012-2025 GizMoCuz
2025-08-08 09:55:31.957  Status: Build Hash: 7f861f5bd, Date: 2025-05-05 09:31:45
2025-08-08 09:55:31.957  Status: Startup Path: /opt/domoticz/
2025-08-08 09:55:31.984  Sunrise: 05:24:00 SunSet: 21:30:00
2025-08-08 09:55:31.984  Day length: 16:06:00 Sun at south: 13:27:00
2025-08-08 09:55:31.984  Civil twilight start: 04:31:00 Civil twilight end: 22:23:00
2025-08-08 09:55:31.984  Nautical twilight start: 03:08:00 Nautical twilight end: 23:46:00
2025-08-08 09:55:31.984  There is no astronomical twilight in the space of 24 hours
2025-08-08 09:55:31.994  Status: PluginSystem: Started, Python version '3.9.2', 0 plugin definitions loaded.
2025-08-08 09:55:31.997  Active notification Subsystems: email, pushbullet (2/13)
2025-08-08 09:55:31.998  Status: WebServer(HTTP) started on address: :: with port 8080
2025-08-08 09:55:31.999  Status: WebServer(SSL) started on address: :: with port 443
2025-08-08 09:55:32.000  Starting shared server on: :::6144
2025-08-08 09:55:32.000  Status: TCPServer: shared server started...
2025-08-08 09:55:32.000  Status: RxQueue: queue worker started...
2025-08-08 09:55:34.003  Status: nrf gw: Trying to connect to: 192.***.***.***:5003
2025-08-08 09:55:34.005  Status: emaplaat: System: ODroid/Raspberry
2025-08-08 09:55:34.005  Status: emaplaat: Hardware Monitor: Started (OStype Linux)
2025-08-08 09:55:34.005  Status: Wol: Started
2025-08-08 09:55:34.005  Status: NotificationSystem: thread started...
2025-08-08 09:55:34.005  Status: EventSystem: reset all events...
2025-08-08 09:55:34.005  Status: EventSystem: reset all device statuses...
2025-08-08 09:55:34.008  Status: nrf gw: Connected to: 192.***.***.***:5003
2025-08-08 09:55:34.013  nrf gw: Gateway Ready...
2025-08-08 09:55:34.015  Status: nrf gw: Node: 0, Sketch Name: GW
2025-08-08 09:55:34.015  Status: nrf gw: Node: 0, Sketch Version: 1
2025-08-08 09:55:34.015  nrf gw: Gateway Version: 2.3.2
2025-08-08 09:55:34.029  Status: Python EventSystem: Initializing event module.
2025-08-08 09:55:34.029  Status: EventSystem: Started
2025-08-08 09:55:34.029  Status: EventSystem: Queue thread started...
Script created by hpapagaj and modified..."]:79: bad argument #1 to 'pairs' (table expected, got nil)
Script created by hpapagaj and modified..."]:79: bad argument #1 to 'pairs' (table expected, got nil)
Script created by hpapagaj and modified..."]:79: bad argument #1 to 'pairs' (table expected, got nil)
2025-08-08 09:55:34.060  nrf gw: Gateway Version: 2.3.2
2025-08-08 09:55:36.008  emaplaat: Temp (CPU)
2025-08-08 09:55:43.394  nrf gw: General/Custom Sensor (Tarbevee rõhk)
2025-08-08 09:55:47.635  nrf gw: Temp (Kaminast)
2025-08-08 09:55:53.162  nrf gw: Temp (Laepeal sisemine)
2025-08-08 09:55:56.013  emaplaat: General/Percentage (Memory Usage)
2025-08-08 09:55:56.014  emaplaat: General/Custom Sensor (Process Usage)
Script created by hpapagaj and modified..."]:79: bad argument #1 to 'pairs' (table expected, got nil)
2025-08-08 09:56:03.534  nrf gw: Temp (Laepeal sisemine)
2025-08-08 09:56:03.687  nrf gw: General/Custom Sensor (Tarbevee rõhk)
2025-08-08 09:56:08.685  nrf gw: Temp (Kaminast)
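As a side note, the repeated `bad argument #1 to 'pairs' (table expected, got nil)` lines come from a Lua event script (the one "created by hpapagaj"), not from the container itself: line 79 of that script iterates with `pairs()` over a table that is still nil during the first event cycles after startup. A minimal, hypothetical guard (the variable name is my assumption, not taken from the actual script):

```lua
-- Hypothetical sketch: fall back to an empty table instead of nil,
-- so pairs() never receives a nil argument right after startup.
local entries = someTable or {}
for name, value in pairs(entries) do
    -- ... original per-entry logic from the script ...
end
```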
waltervl
Posts: 6676
Joined: Monday 28 January 2019 18:48
Target OS: Linux
Domoticz version: 2025.1
Location: NL
Contact:

Re: Strange behaviour on Truenas Scale Container instance

Post by waltervl »

Perhaps it is best to ask this on a TrueNAS Scale container forum, to learn how network protocols are transferred on the TrueNAS Scale platform compared to containers on a standard Docker platform.
Domoticz running on Udoo X86 (on Ubuntu)
Devices/plugins: ZigbeeforDomoticz (with Xiaomi, Ikea, Tuya devices), Nefit Easy, Midea Airco, Omnik Solar, Goodwe Solar

Re: Strange behaviour on Truenas Scale Container instance

Post by iguaan »

It pulls the image directly from https://hub.docker.com/r/domoticz/domoticz

Content of docker-compose.yaml


{# Adjust values so the library will pick some things up automatically #}
{% do values.update({
  "skip_generic_variables": true,
  "images": {
    "image": {
      "repository": values.image.repository,
      "tag": values.image.tag or "latest",
    }
  },
  "network": {
    "dns_opts": values.dns_config.get("options", []),
    "dns_searches": values.dns_config.get("searches", []),
    "dns_nameservers": values.dns_config.get("nameservers", []),
  }
}) %}

{% for label in values.labels %}
  {% do label.update({"containers": [values.ix_context.app_name]}) %}
{% endfor %}

{# Any manipulation to values should be done before this point #}

{# Template starts here #}
{% set tpl = ix_lib.base.render.Render(values) %}

{# Image Configuration #}
{% set c1 = tpl.add_container(values.ix_context.app_name, "image") %}
{% do c1.set_pull_policy(values.image.pull_policy) %}

{# Container Configuration #}
{% if values.restart_policy == "on-failure" %}
  {% do c1.restart.set_policy(values.restart_policy, values.max_retry_count) %}
{% else %}
  {% do c1.restart.set_policy(values.restart_policy) %}
{% endif %}

{% do c1.set_tty(values.tty) %}
{% do c1.set_stdin(values.stdin) %}

{% if values.hostname %}
  {% do c1.set_hostname(values.hostname) %}
{% endif %}

{% if values.entrypoint %}
  {% do c1.set_entrypoint(values.entrypoint) %}
{% endif %}

{% if values.command %}
  {% do c1.set_command(values.command) %}
{% endif %}

{% for device in values.devices %}
  {% do c1.devices.add_device(device.host_device, device.container_device) %}
{% endfor %}

{% if values.disable_builtin_healthcheck %}
  {% do c1.healthcheck.disable() %}
{% else %}
  {% do c1.healthcheck.use_built_in() %}
{% endif %}

{% do c1.environment.add_env("TZ", values.TZ) %}
{% do c1.environment.add_user_envs(values.envs) %}

{# Network Configuration #}
{% if values.host_network %}
  {% do c1.set_network_mode("host") %}
{% else %}
  {% for port in values.ports %}
    {% do c1.add_port(port) %}
  {% endfor %}
{% endif %}

{# Security Context Configuration #}
{% do c1.set_privileged(values.privileged) %}
{% do c1.clear_caps() %}
{% do c1.remove_security_opt("no-new-privileges") %}

{% do c1.add_caps(values.capabilities.add) %}
{% if values.run_as_custom_user %}
  {% do c1.set_user(values.run_as.user, values.run_as.group) %}
{% endif %}

{% for store in values.storage %}
  {% do c1.add_storage(store.mount_path, store) %}
{% endfor %}

{% if not values.resources.enable_resource_limits %}
  {% do c1.deploy.resources.remove_cpus_and_memory() %}
{% endif %}

{% for portal in values.portals %}
  {% do tpl.portals.add({"bind_mode": "published"}, portal) %}
{% endfor %}

{% do tpl.notes.set_body(values.notes) %}

{{ tpl.render() | tojson }}
This is my first time using Docker.

Re: Strange behaviour on Truenas Scale Container instance

Post by iguaan »

Found the issue, and it's a bit awkward.
I have virtual buttons that trigger a URL, and it still referenced the Domoticz IP on the Pi, so when I started the Pi service again, the commands went through.
I guess there wasn't any delay after all; I had been starting/stopping the service on the Pi without noticing this association. :oops:
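For anyone hitting the same thing: the "On Action"/"Off Action" URL of a virtual switch has to point at the new Domoticz instance, not the old one. It would look something like this (the host, the port 8081 from the config above, and the idx value are examples, not the actual ones used here):

```
http://<truenas-ip>:8081/json.htm?type=command&param=switchlight&idx=123&switchcmd=On
```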

Re: Strange behaviour on Truenas Scale Container instance

Post by waltervl »

Great that it works again as expected. So no TrueNAS issue :)