Lightning node stuck/won't start

After the 0.5.4 update, my Lightning node never finishes starting.

Have tried the following:

  1. Confirmed the Bitcoin node is 100% synced
  2. Restarted Umbrel from both the app and the command line
  3. Added "reset-wallet-transactions=true" to lnd.conf (the wallet only has 1 or 2 transactions from the initial receive)
  4. Stop + sudo rm -f .env && sudo ./scripts/configure + start (expanded below)
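
(Step 4 expanded, roughly, as a sketch; it assumes the default ~/umbrel install path and that the stock scripts/stop and scripts/start helpers are still in place:)

sudo ~/umbrel/scripts/stop
sudo rm -f ~/umbrel/.env
sudo ~/umbrel/scripts/configure
sudo ~/umbrel/scripts/start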

Seeing this after restart:
[Screenshot: Lightning app loading screen (Screen Shot 2023-09-05 at 11.46.55 AM)]
The progress bar doesn't advance after 24 hours, and the app eventually throws an error (ENOTFOUND).

Logs:
umbrel@umbrel:~ $ sudo ~/umbrel/scripts/debug

= Umbrel debug info =

Umbrel version

0.5.4

Flashed OS version

v0.5.4

Raspberry Pi Model

Revision : c03111
Serial : 100000002da67290
Model : Raspberry Pi 4 Model B Rev 1.1

Firmware

May 9 2023 12:16:34
Copyright (c) 2012 Broadcom
version 30aa0d70ab280427ba04ebc718c81d4350b9d394 (clean) (release) (start)

Temperature

temp=46.7'C

Throttling

throttled=0x0

Memory usage

               total        used        free      shared  buff/cache   available
Mem:            3.8G        2.9G        340M        7.0M        571M        829M
Swap:           4.1G        2.0G        2.1G

total: 76.0%
mempool: 26.3%
bitcoin: 25.4%
system: 9.9%
electrs: 6.9%
lightning: 3.2%
core-lightning: 2.6%
jam: 1.1%
btc-rpc-explorer: 0.6%

Memory monitor logs

2023-08-28 01:37:02 Memory monitor running!
2023-08-28 01:37:15 Memory monitor running!
9104 ? S 0:00 bash ./scripts/memory-monitor
Memory monitor is already running
2023-09-03 22:14:26 Memory monitor running!
993 ? S 0:00 bash ./scripts/memory-monitor
Memory monitor is already running
2023-09-03 22:24:37 Memory monitor running!
2023-09-04 16:33:27 Memory monitor running!
2023-09-04 18:28:15 Memory monitor running!

Filesystem information

Filesystem      Size  Used  Avail  Use%  Mounted on
/dev/root        30G  3.8G    25G   14%  /
/dev/sda1       916G  651G   219G   75%  /home/umbrel/umbrel

Startup service logs

Sep 04 16:34:18 umbrel umbrel startup[991]: Creating mempool_mariadb_1 … done
Sep 04 16:34:18 umbrel umbrel startup[991]: Creating mempool_web_1 … done
Sep 04 16:34:18 umbrel umbrel startup[991]: Creating lightning_lnd_1 … done
Sep 04 16:34:18 umbrel umbrel startup[991]: Creating lightning_app_proxy_1 … done
Sep 04 16:34:18 umbrel umbrel startup[991]: Creating mempool_api_1 … done
Sep 04 16:34:19 umbrel umbrel startup[991]: Creating mempool_app_proxy_1 … done
Sep 04 16:34:19 umbrel umbrel startup[991]: Creating electrs_tor_1 … done
Sep 04 16:34:20 umbrel umbrel startup[991]: Creating core-lightning_tor_1 … done
Sep 04 16:34:21 umbrel umbrel startup[991]: Creating core-lightning_app_1 … done
Sep 04 16:34:21 umbrel umbrel startup[991]: Creating core-lightning_tor_server_1 … done
Sep 04 16:34:21 umbrel umbrel startup[991]: Creating electrs_tor_server_1 … done
Sep 04 16:34:22 umbrel umbrel startup[991]: Creating bitcoin_tor_server_1 … done
Sep 04 16:34:22 umbrel umbrel startup[991]: Creating bitcoin_app_proxy_1 … done
Sep 04 16:34:22 umbrel umbrel startup[991]: Creating electrs_electrs_1 … done
Sep 04 16:34:22 umbrel umbrel startup[991]: Creating electrs_app_1 …
Sep 04 16:34:23 umbrel umbrel startup[991]: Creating bitcoin_i2pd_daemon_1 … done
Sep 04 16:34:23 umbrel umbrel startup[991]: Creating core-lightning_app_proxy_1 … done
Sep 04 16:34:23 umbrel umbrel startup[991]: Creating core-lightning_c-lightning-rest_1 … done
Sep 04 16:34:23 umbrel umbrel startup[991]: Creating electrs_app_proxy_1 … done
Sep 04 16:34:23 umbrel umbrel startup[991]: Creating bitcoin_bitcoind_1 … done
Sep 04 16:34:23 umbrel umbrel startup[991]: Creating bitcoin_tor_1 … done
Sep 04 16:34:23 umbrel umbrel startup[991]: Creating bitcoin_server_1 …
Sep 04 16:34:23 umbrel umbrel startup[991]: Creating core-lightning_lightningd_1 … done
Sep 04 16:34:32 umbrel umbrel startup[991]: Creating bitcoin_server_1 … done
Sep 04 16:34:34 umbrel umbrel startup[991]: Creating electrs_app_1 … done
Sep 04 16:34:34 umbrel umbrel startup[991]: Umbrel is now accessible at
Sep 04 16:34:34 umbrel umbrel startup[991]: http://umbrel.local
Sep 04 16:34:34 umbrel umbrel startup[991]: http://192.168.1.102
Sep 04 16:34:34 umbrel systemd[1]: Started Umbrel Startup Service.

External storage service logs

Sep 04 16:33:13 umbrel external storage mounter[483]: Running external storage mount script…
Sep 04 16:33:13 umbrel external storage mounter[483]: Found device “WDC WDB NCE0010PNC”
Sep 04 16:33:13 umbrel external storage mounter[483]: Blacklisting USB device IDs against UAS driver…
Sep 04 16:33:13 umbrel external storage mounter[483]: Rebinding USB drivers…
Sep 04 16:33:13 umbrel external storage mounter[483]: Checking USB devices are back…
Sep 04 16:33:13 umbrel external storage mounter[483]: Waiting for USB devices…
Sep 04 16:33:14 umbrel external storage mounter[483]: Waiting for USB devices…
Sep 04 16:33:15 umbrel external storage mounter[483]: Waiting for USB devices…
Sep 04 16:33:16 umbrel external storage mounter[483]: Checking if the device is ext4…
Sep 04 16:33:16 umbrel external storage mounter[483]: Yes, it is ext4
Sep 04 16:33:16 umbrel external storage mounter[483]: Checking filesystem for corruption…
Sep 04 16:33:16 umbrel external storage mounter[483]: e2fsck 1.44.5 (15-Dec-2018)
Sep 04 16:33:17 umbrel external storage mounter[483]: umbrel: clean, 473674/61054976 files, 174488003/244190208 blocks
Sep 04 16:33:17 umbrel external storage mounter[483]: Mounting partition…
Sep 04 16:33:17 umbrel external storage mounter[483]: Checking if device contains an Umbrel install…
Sep 04 16:33:17 umbrel external storage mounter[483]: Yes, it contains an Umbrel install
Sep 04 16:33:17 umbrel external storage mounter[483]: Bind mounting external storage over local Umbrel installation…
Sep 04 16:33:17 umbrel external storage mounter[483]: Bind mounting external storage over local Docker data dir…
Sep 04 16:33:17 umbrel external storage mounter[483]: Bind mounting external storage to /swap
Sep 04 16:33:17 umbrel external storage mounter[483]: Bind mounting SD card root at /sd-card…
Sep 04 16:33:17 umbrel external storage mounter[483]: Checking Umbrel root is now on external storage…
Sep 04 16:33:18 umbrel external storage mounter[483]: Checking /var/lib/docker is now on external storage…
Sep 04 16:33:18 umbrel external storage mounter[483]: Checking /swap is now on external storage…
Sep 04 16:33:18 umbrel external storage mounter[483]: Setting up swapfile
Sep 04 16:33:19 umbrel external storage mounter[483]: Setting up swapspace version 1, size = 4 GiB (4294963200 bytes)
Sep 04 16:33:19 umbrel external storage mounter[483]: no label, UUID=3c18e26b-ff49-42b2-8113-1fd9e4ae61ab
Sep 04 16:33:20 umbrel external storage mounter[483]: Checking SD Card root is bind mounted at /sd-root…
Sep 04 16:33:20 umbrel external storage mounter[483]: Starting external drive mount monitor…
Sep 04 16:33:20 umbrel external storage mounter[483]: Mount script completed successfully!
Sep 04 16:33:20 umbrel systemd[1]: Started External Storage Mounter.

External storage SD card update service logs

– Logs begin at Mon 2023-09-04 16:33:06 UTC, end at Tue 2023-09-05 15:49:14 UTC. –
Sep 04 16:33:27 umbrel systemd[1]: Starting External Storage SDcard Updater…
Sep 04 16:33:27 umbrel external storage updater[913]: Checking if SD card Umbrel is newer than external storage…
Sep 04 16:33:27 umbrel external storage updater[913]: No, SD version is not newer, exiting.
Sep 04 16:33:27 umbrel systemd[1]: Started External Storage SDcard Updater.

Karen logs

% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 8684 100 146 100 8538 61 3601 0:00:02 0:00:02 --:--:-- 3662
{“message”:“Successfully uploaded backup 1693893250112.tar.gz.pgp for backup ID 9cc06b72c0c693e130812ed52625bbd738cbaa7680b26f14be6616cf12859aaa”}

====== Backup success =======

Got signal: backup
karen is getting triggered!
Deriving keys…
Creating backup…
Adding random padding…
1+0 records in
1+0 records out
3156 bytes (3.2 kB, 3.1 KiB) copied, 0.000227014 s, 13.9 MB/s
Creating encrypted tarball…
backup/
backup/channel.backup
backup/.padding
Uploading backup…
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 3957 100 146 100 3811 52 1370 0:00:02 0:00:02 --:--:-- 1422
{“message”:“Successfully uploaded backup 1693906477215.tar.gz.pgp for backup ID 9cc06b72c0c693e130812ed52625bbd738cbaa7680b26f14be6616cf12859aaa”}

====== Backup success =======

Got signal: debug
karen is getting triggered!
Debug result file generated
Got signal: backup
karen is getting triggered!
Deriving keys…
Creating backup…
Adding random padding…
1+0 records in
1+0 records out
902 bytes copied, 0.00022431 s, 4.0 MB/s
Creating encrypted tarball…
backup/
backup/channel.backup
backup/.padding
Uploading backup…
% Total % Received % Xferd Average Speed Time Time Time Current
Dload Upload Total Spent Left Speed
100 1675 100 146 100 1529 45 477 0:00:03 0:00:03 --:--:-- 523
{“message”:“Successfully uploaded backup 1693928447417.tar.gz.pgp for backup ID 9cc06b72c0c693e130812ed52625bbd738cbaa7680b26f14be6616cf12859aaa”}

====== Backup success =======

Docker containers

NAMES STATUS
lightning_app_1 Up About an hour
lightning_app_proxy_1 Up About an hour
lightning_tor_1 Up About an hour
lightning_tor_server_1 Up About an hour
lightning_lnd_1 Restarting (2) 58 seconds ago
bitcoin_server_1 Up 21 hours
electrs_app_1 Up 21 hours
core-lightning_app_1 Up 21 hours
core-lightning_app_proxy_1 Up 21 hours
core-lightning_c-lightning-rest_1 Up 21 hours
core-lightning_tor_1 Up 21 hours
mempool_web_1 Up 21 hours
mempool_api_1 Up 21 hours
core-lightning_lightningd_1 Up 21 hours
mempool_mariadb_1 Up 21 hours
mempool_app_proxy_1 Up 21 hours
mempool_tor_server_1 Up 21 hours
bitcoin_bitcoind_1 Up 21 hours
core-lightning_tor_server_1 Up 21 hours
bitcoin_tor_1 Up 21 hours
electrs_electrs_1 Up 21 hours
bitcoin_app_proxy_1 Up 21 hours
electrs_tor_1 Up 21 hours
electrs_app_proxy_1 Up 21 hours
bitcoin_tor_server_1 Up 21 hours
bitcoin_i2pd_daemon_1 Up 21 hours
electrs_tor_server_1 Up 21 hours
jam_app_proxy_1 Up 21 hours
jam_web_1 Up 21 hours
btc-rpc-explorer_web_1 Up 21 hours
jam_tor_server_1 Up 21 hours
btc-rpc-explorer_app_proxy_1 Up 21 hours
btc-rpc-explorer_tor_server_1 Up 21 hours
nginx Up 21 hours
tor_server Up 21 hours
auth Up 21 hours
manager Up 21 hours
dashboard Up 21 hours
tor_proxy Up 21 hours

Umbrel logs

Attaching to manager
manager | ::ffff:10.21.21.2 - - [Tue, 05 Sep 2023 15:49:18 GMT] “GET /v1/system/info HTTP/1.0” 304 - “-” “Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36”
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Tue, 05 Sep 2023 15:49:18 GMT] “GET /v1/apps?installed=1 HTTP/1.0” 304 - “-” “Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36”
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Tue, 05 Sep 2023 15:49:18 GMT] “GET /v1/system/memory HTTP/1.0” 200 1012 “-” “Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36”
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Tue, 05 Sep 2023 15:49:18 GMT] “GET /v1/system/update-status HTTP/1.0” 304 - “-” “Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36”
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Tue, 05 Sep 2023 15:49:18 GMT] “GET /v1/system/storage HTTP/1.0” 200 1020 “-” “Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36”
manager |
manager | umbrel-manager
manager | ::ffff:10.21.0.21 - - [Tue, 05 Sep 2023 15:49:18 GMT] “GET /v1/account/token?token=d8a868e5112fe4e72170830db7465977cb5a950931f2c4f6f42c25765c4a7da4 HTTP/1.1” 200 16 “-” “app-proxy/0.0.1”
manager |
manager | umbrel-manager
manager | ::ffff:10.21.0.10 - - [Tue, 05 Sep 2023 15:49:18 GMT] “GET /v1/account/token?token=d8a868e5112fe4e72170830db7465977cb5a950931f2c4f6f42c25765c4a7da4 HTTP/1.1” 200 16 “-” “app-proxy/0.0.1”
manager |
manager | umbrel-manager
manager | ::ffff:10.21.0.10 - - [Tue, 05 Sep 2023 15:49:18 GMT] “GET /v1/account/token?token=d8a868e5112fe4e72170830db7465977cb5a950931f2c4f6f42c25765c4a7da4 HTTP/1.1” 200 16 “-” “app-proxy/0.0.1”
manager |
manager | umbrel-manager
manager | ::ffff:10.21.0.10 - - [Tue, 05 Sep 2023 15:49:18 GMT] “GET /v1/account/token?token=d8a868e5112fe4e72170830db7465977cb5a950931f2c4f6f42c25765c4a7da4 HTTP/1.1” 200 16 “-” “app-proxy/0.0.1”
manager |
manager | umbrel-manager
manager | ::ffff:10.21.21.2 - - [Tue, 05 Sep 2023 15:49:18 GMT] “GET /v1/system/get-update HTTP/1.0” 304 - “-” “Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36”
manager |
manager | umbrel-manager

Tor Proxy logs

Attaching to tor_proxy
tor_proxy | Sep 05 06:04:32.000 [warn] Received http status code 404 (“Not found”) from server 51.158.201.235:18256 while fetching “/tor/keys/fp/EFCBE720AB3A82B99F9E953CD5BF50F7EEFC7B97”.
tor_proxy | Sep 05 06:28:21.000 [notice] Heartbeat: Tor’s uptime is 12:00 hours, with 13 circuits open. I’ve sent 14.19 MB and received 23.40 MB. I’ve received 25 connections on IPv4 and 0 on IPv6. I’ve made 13 connections with IPv4 and 0 with IPv6.
tor_proxy | Sep 05 06:28:21.000 [notice] While bootstrapping, fetched this many bytes: 733287 (consensus network-status fetch); 14358 (authority cert fetch); 11915500 (microdescriptor fetch)
tor_proxy | Sep 05 06:28:21.000 [notice] While not bootstrapping, fetched this many bytes: 408645 (consensus network-status fetch); 7356 (authority cert fetch); 1833175 (microdescriptor fetch)
tor_proxy | Sep 05 06:28:21.000 [notice] Heartbeat: Our onion service received 0 v3 INTRODUCE2 cells and attempted to launch 0 rendezvous circuits.
tor_proxy | Sep 05 07:06:33.000 [warn] Received http status code 404 (“Not found”) from server 51.158.201.235:18256 while fetching “/tor/keys/fp/EFCBE720AB3A82B99F9E953CD5BF50F7EEFC7B97”.
tor_proxy | Sep 05 12:28:21.000 [notice] Heartbeat: Tor’s uptime is 18:00 hours, with 9 circuits open. I’ve sent 17.24 MB and received 26.91 MB. I’ve received 26 connections on IPv4 and 0 on IPv6. I’ve made 16 connections with IPv4 and 0 with IPv6.
tor_proxy | Sep 05 12:28:21.000 [notice] While bootstrapping, fetched this many bytes: 733287 (consensus network-status fetch); 14358 (authority cert fetch); 11915500 (microdescriptor fetch)
tor_proxy | Sep 05 12:28:21.000 [notice] While not bootstrapping, fetched this many bytes: 611991 (consensus network-status fetch); 14523 (authority cert fetch); 1995311 (microdescriptor fetch)
tor_proxy | Sep 05 12:28:21.000 [notice] Heartbeat: Our onion service received 0 v3 INTRODUCE2 cells and attempted to launch 0 rendezvous circuits.

App logs

bitcoin

Attaching to bitcoin_server_1, bitcoin_bitcoind_1, bitcoin_tor_1, bitcoin_app_proxy_1, bitcoin_tor_server_1, bitcoin_i2pd_daemon_1
server_1 | umbrel-middleware
server_1 | ::ffff:10.21.0.10 - - [Tue, 05 Sep 2023 15:49:18 GMT] “GET /v1/bitcoind/info/connections HTTP/1.1” 304 - “-” “Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36”
server_1 |
server_1 | umbrel-middleware
app_proxy_1 | Validating token: d8a868e5112f …
app_proxy_1 | Validating token: d8a868e5112f …
bitcoind_1 | 2023-09-05T15:37:15Z Socks5() connect to 51.174.124.248:8333 failed: general failure
server_1 | ::ffff:10.21.0.10 - - [Tue, 05 Sep 2023 15:49:18 GMT] “GET /v1/bitcoind/info/stats HTTP/1.1” 200 126 “-” “Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36”
app_proxy_1 | Validating token: d8a868e5112f …
app_proxy_1 | Validating token: d8a868e5112f …
bitcoind_1 | 2023-09-05T15:40:39Z New outbound peer connected: version: 70016, blocks=806341, peer=249 (block-relay-only)
bitcoind_1 | 2023-09-05T15:41:59Z Socks5() connect to 2600:6c50:437f:85bc:928:4a29:724c:3067:8333 failed: general failure
server_1 |
app_proxy_1 | Validating token: d8a868e5112f …
app_proxy_1 | Validating token: d8a868e5112f …
app_proxy_1 | Validating token: d8a868e5112f …
bitcoind_1 | 2023-09-05T15:42:21Z New outbound peer connected: version: 70016, blocks=806341, peer=251 (block-relay-only)
bitcoind_1 | 2023-09-05T15:43:22Z Socks5() connect to 172.56.105.52:8333 failed: general failure
app_proxy_1 | Validating token: d8a868e5112f …
server_1 | umbrel-middleware
i2pd_daemon_1 | 15:49:37@272/error - Garlic: Failed to decrypt message
bitcoind_1 | 2023-09-05T15:43:55Z Saw new header hash=00000000000000000003a444b70366765359ec534ca485b35481d67ea298661a height=806342
bitcoind_1 | 2023-09-05T15:43:55Z [net] Saw new cmpctblock header hash=00000000000000000003a444b70366765359ec534ca485b35481d67ea298661a peer=136
tor_1 | Sep 05 14:55:35.000 [notice] Closed 1 streams for service [scrubbed].onion for reason resolve failed. Fetch status: No more HSDir available to query.
tor_1 | Sep 05 14:57:40.000 [notice] Have tried resolving or connecting to address ‘[scrubbed]’ at 3 different places. Giving up.
server_1 | ::ffff:10.21.0.10 - - [Tue, 05 Sep 2023 15:49:18 GMT] “GET /v1/bitcoind/info/sync HTTP/1.1” 200 103 “-” “Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36”
i2pd_daemon_1 | 15:49:39@272/error - Garlic: Failed to decrypt message
bitcoind_1 | 2023-09-05T15:43:57Z UpdateTip: new best=00000000000000000003a444b70366765359ec534ca485b35481d67ea298661a height=806342 version=0x31830000 log2_work=94.400662 tx=889604623 date=‘2023-09-05T15:43:45Z’ progress=1.000000 cache=294.8MiB(2084641txo)
bitcoind_1 | 2023-09-05T15:45:14Z New outbound peer connected: version: 70016, blocks=806342, peer=252 (block-relay-only)
tor_1 | Sep 05 15:03:51.000 [notice] Have tried resolving or connecting to address ‘[scrubbed]’ at 3 different places. Giving up.
tor_1 | Sep 05 15:04:25.000 [notice] Have tried resolving or connecting to address ‘[scrubbed]’ at 3 different places. Giving up.
i2pd_daemon_1 | 15:49:39@272/error - Garlic: Failed to decrypt message
bitcoind_1 | 2023-09-05T15:46:16Z Socks5() connect to 2xc5ior4k6kra4utuoigdqobggxhxesdchb2o4t27xupmlvqf47ugyyd.onion:8333 failed: host unreachable
tor_1 | Sep 05 15:04:36.000 [notice] Closed 1 streams for service [scrubbed].onion for reason resolve failed. Fetch status: No more HSDir available to query.
i2pd_daemon_1 | 15:49:42@272/error - Garlic: Failed to decrypt message
server_1 |
app_proxy_1 | Validating token: d8a868e5112f …
tor_1 | Sep 05 15:06:43.000 [notice] Closed 1 streams for service [scrubbed].onion for reason resolve failed. Fetch status: No more HSDir available to query.
tor_1 | Sep 05 15:28:11.000 [notice] Have tried resolving or connecting to address ‘[scrubbed]’ at 3 different places. Giving up.
server_1 | umbrel-middleware
app_proxy_1 | Validating token: d8a868e5112f …
tor_1 | Sep 05 15:29:34.000 [notice] Application asked to connect to port 0. Refusing.
tor_1 | Sep 05 15:41:59.000 [notice] Have tried resolving or connecting to address ‘[scrubbed]’ at 3 different places. Giving up.
tor_1 | Sep 05 15:43:22.000 [notice] Have tried resolving or connecting to address ‘[scrubbed]’ at 3 different places. Giving up.
i2pd_daemon_1 | 15:49:42@272/error - Garlic: Failed to decrypt message
i2pd_daemon_1 | 15:49:47@272/error - Garlic: Failed to decrypt message
i2pd_daemon_1 | 15:49:48@272/error - Garlic: Failed to decrypt message
i2pd_daemon_1 | 15:49:48@272/error - Garlic: Failed to decrypt message
i2pd_daemon_1 | 15:49:48@272/error - Garlic: Failed to decrypt message
i2pd_daemon_1 | 15:49:48@272/error - Garlic: Failed to decrypt message

lightning

Attaching to lightning_app_1, lightning_app_proxy_1, lightning_tor_1, lightning_tor_server_1, lightning_lnd_1
app_1 | Checking LND status…
app_1 | [backup-monitor] Checking channel backup…
app_1 | [backup-monitor] Sleeping…
app_1 | Waiting for LND…
app_1 | Checking LND status…
app_1 | ::ffff:10.21.0.21 - - [Tue, 05 Sep 2023 15:49:38 GMT] “GET /v1/lnd/info/status HTTP/1.1” 304 - “-” “Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/115.0.0.0 Safari/537.36”
app_1 |
app_1 | umbrel-lightning
app_1 | Waiting for LND…
app_1 | Checking LND status…
app_proxy_1 | Validating token: d8a868e5112f …
app_proxy_1 | Validating token: d8a868e5112f …
app_proxy_1 | Validating token: d8a868e5112f …
app_proxy_1 | Validating token: d8a868e5112f …
app_proxy_1 | Validating token: d8a868e5112f …
app_proxy_1 | Validating token: d8a868e5112f …
app_proxy_1 | Validating token: d8a868e5112f …
app_proxy_1 | Validating token: d8a868e5112f …
app_proxy_1 | Validating token: d8a868e5112f …
app_proxy_1 | Validating token: d8a868e5112f …
lnd_1 | 2023-09-05 15:49:44.458 [INF] RPCS: gRPC proxy started at 0.0.0.0:8080
lnd_1 | 2023-09-05 15:49:44.458 [INF] LTND: Opening the main database, this might take a few minutes…
lnd_1 | 2023-09-05 15:49:44.458 [INF] LTND: Opening bbolt database, sync_freelist=false, auto_compact=false
lnd_1 | panic: freepages: failed to get all reachable pages (page 23065: multiple references)
lnd_1 |
lnd_1 | goroutine 197 [running]:
lnd_1 | go.etcd.io/bbolt.(*DB).freepages.func2()
lnd_1 | go.etcd.io/bbolt@v1.3.6/db.go:1056 +0x8c
lnd_1 | created by go.etcd.io/bbolt.(*DB).freepages
lnd_1 | go.etcd.io/bbolt@v1.3.6/db.go:1054 +0x13c
tor_1 | Sep 05 14:49:26.000 [notice] Bootstrapped 55% (loading_descriptors): Loading relay descriptors
tor_1 | Sep 05 14:49:28.000 [notice] Bootstrapped 60% (loading_descriptors): Loading relay descriptors
tor_1 | Sep 05 14:49:28.000 [notice] Bootstrapped 68% (loading_descriptors): Loading relay descriptors
tor_1 | Sep 05 14:49:29.000 [notice] Bootstrapped 75% (enough_dirinfo): Loaded enough directory info to build circuits
tor_1 | Sep 05 14:49:29.000 [notice] Bootstrapped 80% (ap_conn): Connecting to a relay to build circuits
tor_1 | Sep 05 14:49:29.000 [notice] Bootstrapped 85% (ap_conn_done): Connected to a relay to build circuits
tor_1 | Sep 05 14:49:30.000 [notice] Bootstrapped 89% (ap_handshake): Finishing handshake with a relay to build circuits
tor_1 | Sep 05 14:49:30.000 [notice] Bootstrapped 90% (ap_handshake_done): Handshake finished with a relay to build circuits
tor_1 | Sep 05 14:49:30.000 [notice] Bootstrapped 95% (circuit_create): Establishing a Tor circuit
tor_1 | Sep 05 14:49:31.000 [notice] Bootstrapped 100% (done): Done

mempool

Attaching to mempool_web_1, mempool_api_1, mempool_mariadb_1, mempool_app_proxy_1, mempool_tor_server_1
api_1 | Sep 5 14:27:11 [128] INFO: Core index ‘txindex’ is synced. Best block height is 806337
api_1 | Sep 5 14:27:11 [128] INFO: Core index ‘basic block filter index’ is synced. Best block height is 806337
api_1 | Sep 5 14:27:20 [128] INFO: Latest BTC fiat averaged price: {“time”:1693846800,“USD”:25695,“EUR”:24317,“GBP”:22184,“CAD”:34954,“CHF”:22849,“AUD”:40326,“JPY”:3794850}
api_1 | Sep 5 14:30:18 [128] ERR: [Lightning] $runTasks() error: Request failed with status code 500
api_1 | Sep 5 14:31:24 [128] ERR: Exception in $logStatsDaily. Reason: connect ECONNREFUSED 10.21.21.9:8080
api_1 | Sep 5 15:28:43 [128] INFO: Core index ‘txindex’ is synced. Best block height is 806340
api_1 | Sep 5 15:28:43 [128] INFO: Core index ‘basic block filter index’ is synced. Best block height is 806340
api_1 | Sep 5 15:28:56 [128] INFO: Latest BTC fiat averaged price: {“time”:1693846800,“USD”:25695,“EUR”:24345,“GBP”:22239,“CAD”:35023,“CHF”:22848,“AUD”:40314,“JPY”:3802375}
api_1 | Sep 5 15:30:06 [128] ERR: [Lightning] $runTasks() error: connect EHOSTUNREACH 10.21.21.9:8080
api_1 | Sep 5 15:31:28 [128] ERR: Exception in $logStatsDaily. Reason: connect EHOSTUNREACH 10.21.21.9:8080
app_proxy_1 | yarn run v1.22.19
app_proxy_1 | $ node ./bin/www
app_proxy_1 | [HPM] Proxy created: / → http://10.21.21.26:3006
app_proxy_1 | Waiting for 10.21.21.26:3006 to open…
app_proxy_1 | mempool is now ready…
app_proxy_1 | Listening on port: 3006
web_1 | /var/www/mempool/browser/resources
mariadb_1 | 2023-09-04 18:28:57 0 [Note] InnoDB: Loading buffer pool(s) from /var/lib/mysql/ib_buffer_pool
mariadb_1 | 2023-09-04 18:28:57 0 [Note] Server socket created on IP: ‘::’.
mariadb_1 | 2023-09-04 18:28:57 0 [Warning] ‘proxies_priv’ entry ‘@% root@c332845d23ac’ ignored in --skip-name-resolve mode.
mariadb_1 | 2023-09-04 18:28:58 0 [Note] Reading of all Master_info entries succeeded
mariadb_1 | 2023-09-04 18:28:58 0 [Note] Added new Master_info ‘’ to hash table
mariadb_1 | 2023-09-04 18:28:58 0 [Note] mysqld: ready for connections.
mariadb_1 | Version: ‘10.5.12-MariaDB-1:10.5.12+maria~focal’ socket: ‘/run/mysqld/mysqld.sock’ port: 3306 mariadb.org binary distribution
mariadb_1 | 2023-09-04 18:28:58 3 [Warning] Aborted connection 3 to db: ‘unconnected’ user: ‘unauthenticated’ host: ‘10.21.21.27’ (This connection closed normally without authentication)
mariadb_1 | 2023-09-04 18:28:58 4 [Warning] Aborted connection 4 to db: ‘unconnected’ user: ‘unauthenticated’ host: ‘10.21.21.26’ (This connection closed normally without authentication)
mariadb_1 | 2023-09-04 18:29:02 0 [Note] InnoDB: Buffer pool(s) load completed at 230904 18:29:02

==== Result ====

The debug script did not automatically detect any issues with your Umbrel.

All other apps appear to be working. I'm out of ideas at this point, so I could use some assistance.
Thanks!

Some follow-up: I'm seeing this pattern repeat over and over when tailing app-data/lightning/data/lnd/logs/bitcoin/mainnet/lnd.log
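
(For reference, the tail was run roughly like this; the path is assumed to sit under the default ~/umbrel directory:)

sudo tail -f ~/umbrel/app-data/lightning/data/lnd/logs/bitcoin/mainnet/lnd.log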

2023-09-05 16:07:47.248 [INF] LTND: Version: 0.16.4-beta commit=v0.16.4-beta, build=production, logging=default, debuglevel=info
2023-09-05 16:07:47.249 [INF] LTND: Active chain: Bitcoin (network=mainnet)
2023-09-05 16:07:47.257 [INF] RPCS: RPC server listening on 0.0.0.0:10009
2023-09-05 16:07:47.303 [INF] RPCS: gRPC proxy started at 0.0.0.0:8080
2023-09-05 16:07:47.304 [INF] LTND: Opening the main database, this might take a few minutes…
2023-09-05 16:07:47.305 [INF] LTND: Opening bbolt database, sync_freelist=false, auto_compact=false
2023-09-05 16:07:47.946 [ERR] RPCS: [/lnrpc.Lightning/GetInfo]: waiting to start, RPC services not available
2023-09-05 16:07:49.032 [ERR] RPCS: [/lnrpc.Lightning/GetInfo]: waiting to start, RPC services not available
2023-09-05 16:08:51.889 [INF] LTND: Version: 0.16.4-beta commit=v0.16.4-beta, build=production, logging=default, debuglevel=info
2023-09-05 16:08:51.889 [INF] LTND: Active chain: Bitcoin (network=mainnet)
2023-09-05 16:08:51.897 [INF] RPCS: RPC server listening on 0.0.0.0:10009
2023-09-05 16:08:51.901 [INF] RPCS: gRPC proxy started at 0.0.0.0:8080
2023-09-05 16:08:51.901 [INF] LTND: Opening the main database, this might take a few minutes…
2023-09-05 16:08:51.902 [INF] LTND: Opening bbolt database, sync_freelist=false, auto_compact=false
2023-09-05 16:09:53.233 [INF] LTND: Version: 0.16.4-beta commit=v0.16.4-beta, build=production, logging=default, debuglevel=info
2023-09-05 16:09:53.234 [INF] LTND: Active chain: Bitcoin (network=mainnet)
2023-09-05 16:09:53.239 [INF] RPCS: RPC server listening on 0.0.0.0:10009
2023-09-05 16:09:53.244 [INF] RPCS: gRPC proxy started at 0.0.0.0:8080
2023-09-05 16:09:53.245 [INF] LTND: Opening the main database, this might take a few minutes…
2023-09-05 16:09:53.245 [INF] LTND: Opening bbolt database, sync_freelist=false, auto_compact=false
2023-09-05 16:10:54.659 [INF] LTND: Version: 0.16.4-beta commit=v0.16.4-beta, build=production, logging=default, debuglevel=info
2023-09-05 16:10:54.659 [INF] LTND: Active chain: Bitcoin (network=mainnet)
2023-09-05 16:10:54.662 [INF] RPCS: RPC server listening on 0.0.0.0:10009
2023-09-05 16:10:54.666 [INF] RPCS: gRPC proxy started at 0.0.0.0:8080
2023-09-05 16:10:54.672 [INF] LTND: Opening the main database, this might take a few minutes…
2023-09-05 16:10:54.672 [INF] LTND: Opening bbolt database, sync_freelist=false, auto_compact=false
2023-09-05 16:11:55.823 [INF] LTND: Version: 0.16.4-beta commit=v0.16.4-beta, build=production, logging=default, debuglevel=info
2023-09-05 16:11:55.824 [INF] LTND: Active chain: Bitcoin (network=mainnet)
2023-09-05 16:11:55.829 [INF] RPCS: RPC server listening on 0.0.0.0:10009
2023-09-05 16:11:55.836 [INF] RPCS: gRPC proxy started at 0.0.0.0:8080
2023-09-05 16:11:55.836 [INF] LTND: Opening the main database, this might take a few minutes…
2023-09-05 16:11:55.837 [INF] LTND: Opening bbolt database, sync_freelist=false, auto_compact=false
2023-09-05 16:12:59.304 [INF] LTND: Version: 0.16.4-beta commit=v0.16.4-beta, build=production, logging=default, debuglevel=info
2023-09-05 16:12:59.304 [INF] LTND: Active chain: Bitcoin (network=mainnet)
2023-09-05 16:12:59.315 [INF] RPCS: RPC server listening on 0.0.0.0:10009
2023-09-05 16:12:59.368 [INF] RPCS: gRPC proxy started at 0.0.0.0:8080
2023-09-05 16:12:59.375 [INF] LTND: Opening the main database, this might take a few minutes…
2023-09-05 16:12:59.375 [INF] LTND: Opening bbolt database, sync_freelist=false, auto_compact=false
2023-09-05 16:12:59.517 [ERR] RPCS: [/lnrpc.Lightning/GetInfo]: waiting to start, RPC services not available
2023-09-05 16:13:00.577 [ERR] RPCS: [/lnrpc.Lightning/GetInfo]: waiting to start, RPC services not available

Hmm, this is strange. Nothing stands out for me either.

Do you get any errors/issues from the command line after running the following?

sudo ~/umbrel/scripts/app restart lightning

@smolgrrr - Thanks for taking the time to have a look.

No errors when I run that command:

sudo ~/umbrel/scripts/app restart lightning
[sudo] password for umbrel:
Stopping app lightning…
Stopping lightning_app_1 … done
Stopping lightning_app_proxy_1 … done
Stopping lightning_tor_1 … done
Stopping lightning_tor_server_1 … done
Stopping lightning_lnd_1 … done
Going to remove lightning_app_1, lightning_app_proxy_1, lightning_tor_1, lightning_tor_server_1, lightning_lnd_1
Removing lightning_app_1 … done
Removing lightning_app_proxy_1 … done
Removing lightning_tor_1 … done
Removing lightning_tor_server_1 … done
Removing lightning_lnd_1 … done
Starting app lightning…
Executing hook: /home/umbrel/umbrel/app-data/lightning/hooks/pre-start
Creating lightning_lnd_1 … done
Creating lightning_tor_server_1 … done
Creating lightning_tor_1 … done
Creating lightning_app_proxy_1 … done
Creating lightning_app_1 … done
umbrel@umbrel:~ $

Same loading screen for the app (shown above) after running the command.

So it appears that the container might be stuck in a restart loop.

lightninglabs/lnd:v0.16.4-beta “lnd --configfile=/d…” 3 days ago Restarting (2) 13 seconds ago

Attempting to exec into the container gives a similar result:
docker exec -it CONTAINER_ID bash
Error response from daemon: Container CONTAINER_ID is restarting, wait until the container is running
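
(Since exec isn't possible while the container keeps restarting, its recent output can still be read with docker logs; a sketch, assuming the lightning_lnd_1 container name from the list above:)

sudo docker logs --tail 50 lightning_lnd_1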

Any ideas on how to troubleshoot the startup?

Interesting. Can you please share the output of the following?

cat ~/umbrel/app-data/lightning/data/lnd/umbrel-lnd.conf

cat ~/umbrel/app-data/lightning/data/lnd/umbrel-lnd.conf

This file is automatically generated by Umbrel, any changes will be overwritten.

Generated at: 2023-09-09T17:14:51.862Z

tlsautorefresh=true
sync-freelist=false
payments-expiration-grace-period=30m
maxpendingchannels=1
minchansize=20000
maxchansize=16777215
coop-close-target-confs=6
stagger-initial-reconnect=false
max-cltv-expiry=2016
max-commit-fee-rate-anchors=10
accept-amp=true
gc-canceled-invoices-on-startup=false
gc-canceled-invoices-on-the-fly=false
allow-circular-route=false
color=#3399FF
bitcoin.defaultchanconfs=2
bitcoin.basefee=1000
bitcoin.feerate=1
bitcoin.timelockdelta=80
watchtower.active=false
wtclient.active=false
wtclient.sweep-fee-rate=10
routerrpc.minrtprob=0.01
routerrpc.apriori.hopprob=0.6
routerrpc.apriori.weight=0.5
routerrpc.apriori.penaltyhalflife=1h
routerrpc.attemptcost=100
routerrpc.attemptcostppm=1000
routerrpc.maxmchistory=1000
caches.channel-cache-size=20000
protocol.wumbo-channels=false
db.bolt.auto-compact=false
db.bolt.auto-compact-min-age=168h
rpcmiddleware.enable=true
routing.strictgraphpruning=false

@smolgrrr - Appreciate you checking back on this one. I provided the output you requested (above), just in case you see something that might be helpful in the future.

In the meantime, I decided to just go ahead and nuke my lightning node and use the recovery steps outlined here:

The re-install successfully restored my lightning node and wallet balance, so things are looking good now.

Thanks again for your help!

Great outcome! Have a nice week!