Recovering a lightning wallet without a backup file

Thank you @DaveC for the info. I will DM any extra backup requests, likely by Monday or Tuesday; they've been requested.

I will try to consolidate the information on waiting close channels and disaster recovery into this guide: Steps for Waiting Close Channels and Unconfirmed Transactions

For anyone still having an issue, please try restarting your Lightning node or uploading the backup files again, or DM me with what you've tried so far; any other extenuating details of what led you here help too.

We are continuing to streamline this process.

Feel free to post in the Bitcoin and Lightning Telegram channel here about any ongoing issues as well.

Here is a breakdown of best practices:

These are my recommendations for the Lightning Node App:

1 ) If you are on a Pi 4, or any other platform, be prepared for contingencies (for example, keep a spare MicroSD card) and reach out to us if you get any error; many solutions are here (Official Umbrel Troubleshooting Guide and FAQ). Ideally we can troubleshoot whatever platform you're currently on and get your machine functioning ASAP so we do not have to reinstall the Lightning Node App, because reinstalling the Lightning Node App will require us to perform the recovery.

2 ) Keep your 24 words somewhere secure. If you plan on making any changes to your setup, or to pre-empt any issue, balance your channels and close them before migrating, then re-open them however you'd like in an optimized way (see the example commands just after this list).

3 ) For the non-casual user: only attempt off-chain channel migration if you are confident in it. These are the manual disaster recovery steps we are attempting to assist with, detailed there. It's a technical process that requires never running two instances with the same seed at once, and it has lots of nuances you should research and test; the process is also different on LND versus CLN.

4 ) In some catastrophic event, if you're relocating, or if you've lost access to your machine, you can use your 24 words to recover your channels. This will close the channels and may require you to work with the peers, following steps to communicate with them so the channels close successfully, which can sometimes take a while depending on issues such as peers not being online.
We have automated this process as Automated Recovery to make it easier, and it is demonstrated here.
We’re working to make this even more reliable!
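
As a rough sketch of the graceful close mentioned in step 2 (this assumes a standard LND setup where you can run lncli; the channel point values are placeholders you would take from your own listchannels output):

lncli listchannels

lncli closechannel --funding_txid [funding_txid] --output_index [output_index]

A cooperative close like this settles at the current balances; only add --force if a peer is unresponsive, since a force close locks your funds until the time lock expires.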

However, the Automated Recovery always uses the SCB file, which requires cooperation from peers (information on the backups). When we have issues with the backups it's usually a connection issue, a peer running a CLN or Eclair Lightning node, or an offline peer causing a channel to be stuck in waiting close.
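
For reference, the manual equivalent of that SCB-based recovery on a plain LND node looks like this (a sketch only; Umbrel's Automated Recovery handles this for you, and the file path is a placeholder):

lncli restorechanbackup --multi_file /path/to/channel.backup

LND then reaches out to every peer in the backup and asks it to force close the channel to your recovered node, which is why an offline, CLN, or Eclair peer can leave a channel stuck in waiting close.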

In those circumstances, if the CLN or Eclair peers are online, we can install chantools and use the triggerforceclose command:

As an example, to install chantools:

wget https://github.com/lightninglabs/chantools/releases/download/v0.12.0/chantools-linux-arm64-v0.12.0.tar.gz

tar -zxvf chantools-linux-arm64-v0.12.0.tar.gz

cd [chantools directory]

Running ./chantools --help on your Umbrel then lists all the available chantools commands; the recovery commands will ask for your seed, which is what initiates communication with your peers on your node's behalf.

Then in certain situations, you can run this command:

./chantools triggerforceclose --peer [pubkey@host:port] --channel_point [funding_txid:output_index]
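
Filled in with made-up placeholder values (the pubkey, address, and channel point below are shortened and not real; replace them with your peer's details):

./chantools triggerforceclose --peer 02abcd...ef12@203.0.113.5:9735 --channel_point 6f3a...c9d1:0

chantools then connects to that peer directly and asks it to force close the channel on-chain.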

More resources are in the "When should I use what command?" section here:
GitHub - lightninglabs/chantools: A loose collection of tools all somehow related to lnd and Lightning Network channels.

This is just one example of an issue that can occur with a channel after a disaster recovery.

I understand that's a technical process, so here is another thing we can do for anyone with stuck waiting close channels (if the first guide linked above has not worked).

If you'd like even more assistance, please make a post in the LND GitHub Discussions and share the output of your pendingchannels command (once it populates channels it won't just say 0, you'll see them listed there; there is a guide on getting the pendingchannels output) and we can get more expertise on what can be done. Please paste the output of that command there; you can see an example post here.
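
A minimal example of capturing that output (this assumes you have terminal access where lncli can reach your LND node; the filename is just a suggestion):

lncli pendingchannels > pendingchannels.json

Then paste the contents of pendingchannels.json into your post.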

I will work to consolidate this, and we are working to make this process easier soon!