Hi Guys,
A while back I mentioned a possible idea for distribution of files players might need in order to get onto Persistent Worlds and the like.
Eg: Let's imagine the persistent world has 40 hak files which the players need.
The idea was to use something called the InterPlanetary File System (IPFS), which uses the same sort of technology as BitTorrent (DHT - Distributed Hash Tables).
This is not to attack the usefulness of the vault, but to perhaps augment its usefulness.
Instead of all that content being stored in a single place - eg: all the eggs in one basket - the IPFS network stores the relevant data, and it exists for as long as anyone needs the content.
I've developed a tool
http://gaming-content.net/Home/Download
I would rather the download location be hosted here, so bug fixes can occur as they need to.
This application will help manage the task of adding files to and downloading them from the IPFS network.
The tool is written in .Net C# so feel free to decompile - it's not like I could stop you anyway. hehe
Features include:
Upload capability:
Point the executable at a folder structure and it will grab all the files under that folder structure and add them to the IPFS network.
At the end of the process, it will give you a single IPFS identifier that will bind all those files together.
Download capability:
Supply the hash that the Upload scenario gave you as its output, and by default it will generate a Config.json file listing the file extensions found in the manifest.
You can then edit this config file to specify where you want each file type to go.
Eg: .hak files to the hak directory
.bmu to the music directory.
This also works very well for decompressed/extracted hak files - if you want to put them in the override directory.
When it is downloading, it will attempt to fetch the files from various public gateways to the IPFS network. This process in itself helps distribute the content further on the IPFS network, helping keep the content online.
In my own tests, I was able to download 4200 files in around 200 seconds. (It downloads 10 files concurrently.)
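The concurrent download loop can be sketched as below - a hypothetical Python stand-in for what the tool does, with `fetch` as a placeholder for the real HTTP request to a gateway:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical sketch of the tool's download loop: it pulls files from
# public IPFS gateways 10 at a time. `fetch` is a placeholder - a real
# implementation would GET the gateway URL and write the bytes to disk.
GATEWAY = "https://ipfs.io/ipfs/"

def fetch(file_hash):
    # Placeholder for the real download of GATEWAY + file_hash.
    return file_hash

def download_all(hashes, workers=10):
    """Download every hash in the manifest, `workers` files at a time."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(fetch, hashes))
```

With 10 workers, a batch of thousands of small files finishes quickly because the per-request latency overlaps instead of adding up.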
The use case I am imagining:
A server admin has compiled a list of hak files required for the server he administrates
He copies them into a folder on his desktop (he doesn't have to, but it's easier to explain this way)
He points the tool at that directory, and specifies the output file of 'manifest.txt'
His 30-40 hak files are processed, added to the network - and he gets a manifest.txt file
He looks inside it and sees a hash that looks like this:
QmdYGZrySnXswNuzjF2Ls6nK8B77zE6wmB2kcCLiVvNfrd
He then shares this hash with the people who want to play on his server.
The people who want to play on his server receive the hash, and use the tool for the Download scenario
They call it on command line like so:
NwnIpfsUpload.exe "download" "QmdYGZrySnXswNuzjF2Ls6nK8B77zE6wmB2kcCLiVvNfrd"
The first time it runs, if there is no Config.json or a path to one is not specified on the command line - it will generate one based on the manifest.
So if the manifest says there are hak files it will generate a Config that looks like
{
"FileTypeToDirectory":{".hak":"" }
}
The user or the admin could customize the Config to point at their NWN directory
{
"FileTypeToDirectory":{".hak":"C:\\NeverwinterNights\\hak" }
}
He then re-executes the command line
NwnIpfsUpload.exe "download" "QmdYGZrySnXswNuzjF2Ls6nK8B77zE6wmB2kcCLiVvNfrd"
Then BAM - it will start downloading the hak files 10 at a time, and install them in the correct place.
Feel free to ask any questions.
Some questions I can foresee:
1. What is required to upload files?
You will need to download the IPFS client for windows or whatever OS you are on.
Then you will need to run it in daemon mode: ipfs.exe daemon
While this is running, you can then execute the upload steps above.
Note - even if the Upload appears to have finished - it is advisable that you leave your IPFS.exe client running in daemon mode for about 20-30 minutes.
Depending on the amount of files being uploaded, it can take a while for them to travel from your computer to the wider world.
2. How does the IPFS network work?
Basically - anyone running a node/daemon is donating a small amount of disk space to the IPFS network. Don't worry, this does not mean that your computer will be hosting other people's files unasked. The way it works is that in order for content to be pulled onto a node/daemon, it needs to be requested through that node/daemon. If no one makes any requests for a file through your node/daemon, then the only files hosted on it are the ones you are adding to the network yourself.
3. Wait... I thought you said this was being distributed to other people's computers, not just mine?
That's right, but before it can get to other people's computers, it has to start off on yours. After you have performed the upload steps and left your daemon online for about 20-30 minutes, you can take your daemon offline and execute the download scenario yourself to see that you are downloading the content from the public IPFS gateways.
4. What's the benefit of IPFS over a standard website?
In a nutshell, the files you upload exist everywhere and nowhere at the same time. The more often a file is requested on the IPFS network, the more available it becomes, because it will traverse the network and get left behind as it propagates. The network is also self-cleaning - so if no one has any interest in a file, that file may eventually just disappear from the network.
5. Wait - I thought you said the files would exist forever?
Well.. forever is a long time - but yes, there is a way to make your files exist forever.
In your IPFS client you can execute ipfs.exe pin add <HASH> (where <HASH> is the hash of the file you want to keep forever)
What this does is keep the file on your node/daemon forever, regardless of whether anyone requests it or not.
If you want to keep your files alive on the IPFS network, they need to be requested once in a while - unfortunately, I do not know how often that would be. I would imagine once a month might be enough?
You may not even need to download the entire files to keep them alive.
An HTTP HEAD request to http://ipfs.io/ipfs/<HASH> would probably be enough to keep the file alive.
(HEAD requests fetch only the HTTP headers for the URL/document: so instead of downloading a 30 MB hak file, you will download a few bytes or KB of data at most.)
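Such a keep-alive ping could be built like this - a small Python sketch, assuming the ipfs.io gateway URL shown above (the `keep_alive_request` helper is my own name). It only constructs the request; actually sending it with `urllib.request.urlopen(req)` needs network access and would fetch just the headers, not the file body:

```python
import urllib.request

def keep_alive_request(file_hash):
    """Build the HEAD request that would refresh a file on a public gateway.

    Sketch only: pass the returned object to urllib.request.urlopen to
    actually send it. A HEAD response carries headers only, so the hak
    file itself is never transferred.
    """
    url = "http://ipfs.io/ipfs/" + file_hash
    return urllib.request.Request(url, method="HEAD")
```

A scheduled job (say, monthly, per the guess above) sending one such request per manifest hash would keep a whole hak set warm on the gateways for almost no bandwidth.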
6. Can I build my own IPFS Client using yours as a starting point?
Sure - just remember me in the credits.
7. I am trying to download, but it seems to be looping on the same files continuously.
Try turning your ipfs.exe daemon on.
This is usually a symptom of the files not having propagated to the public gateways yet. You will generally need to keep the daemon running for 30 minutes or so for large upload tasks, and then I recommend carrying out a download test using your manifest hash with the daemon online also.
Once that download is carried out successfully, you should be able to turn the daemon off and it should work for anyone.
8. Shit ... I uploaded the wrong files....
Unfortunately, IPFS is built on the concept of permanence - anything added to the network will exist for as long as it is requested.
If you uploaded something that you didn't want to upload, just don't share the hash - it will disappear from the network eventually.
If you uploaded sensitive material - don't worry too much - the way the network is built is that nothing is really 'browsable' or 'perusable' on the IPFS network.
The only way to access the content, is if someone has the exact hash that maps to the content you uploaded.
The chances of someone guessing or randomly getting a collision with IPFS hashes are statistically negligible - it's more likely a meteorite will collide with the moon, bounce off, hit Mercury and then swing around and hit the Earth.
9. OMG - Someone added my hak files without my permission.
Sorry - not my problem - the IPFS network is decentralized and uncensorable. I do ask that people use it responsibly. Ask permission or check license restrictions on content pertaining to redistribution. Sometimes content can be redistributed to your heart's content, and all you need to do is ensure the license.txt file is included. It can be that simple. I recommend checking with the author of said content before making an upload decision.
If you have uploaded something, and the Content author would like it removed - the only recourse would be for the content author to ask nicely that everyone stop using the hash. Eventually the file will fade away into obscurity and be cleaned from the network.
10. Could the Vault utilize the IPFS technology in any way?
Any website could use the IPFS network to augment their delivery of content.
File storage costs, although not going to break the bank, do add up. This is a nice way to keep costs down while also reducing bandwidth charges.
Instead of everyone downloading terabytes of data from the Vault, they are downloading that data from the Public IPFS Gateways. The vault could even run their own gateway if they wanted and choose what files to serve through it.
11. What's next?
I am actually building a website that will allow people to upload directly to the IPFS network via drag and drop on a website instead of using the tool.
It's going to use Dropzone - a JavaScript library that will handle the drag and drop of files - and then it will transfer the files either to
1. An IPFS node that runs in the users browser.
or
2. Uploads to my website, where I will add the content to an IPFS node on my server.
12. What about your bandwidth costs?
It is true - option 2 will cost me outgoing bandwidth from my server to the IPFS network, but this is a one-time cost for each file uploaded. Once the files leave my server, they are on the IPFS network and never get requested from my server again.
13. What if someone else has uploaded a file I want to use?
The IPFS network is smart - the hashes used to identify files are generated from the file contents (an exact binary match).
If someone has already uploaded the same hak file, mdl file, 2da etc., you don't have to worry about duplication. Even if the files were uploaded with different file names, they have the same content and therefore the same hash. The same file will never map to two different hashes.
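Content addressing is easy to see with a plain hash function. IPFS actually uses multihashes over a chunked DAG rather than a straight SHA-256 of the file, so this Python sketch is an analogy, not the real IPFS algorithm - but the principle is identical: the identifier depends only on the bytes, never on the file name.

```python
import hashlib

def content_id(data: bytes) -> str:
    """Toy content address: identical bytes always give an identical
    digest, and different bytes (barring an astronomically unlikely
    collision) give different digests - file names play no part."""
    return hashlib.sha256(data).hexdigest()
```

So two admins independently adding the same hak, even under different names, produce the same identifier, and the network stores it once.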
14. Suggested use case?
As mentioned above - a PW Server Admin could use it to distribute the haks he wants the players to use. At the same time - it can be used as a server installation tool.
Eg: Imagine you are migrating your server to a new box. You could add your entire server folder to the IPFS network, then re-installing at the new location is a case of calling
NwnIpfsUpload.exe "download" "HASH"
Then it will download server vaults, data folders, etc.
15. What if a file is already downloaded?
The tool will check the file's SHA256 hash against the hash listed in the manifest - if it matches, then it will not download it.
If the hash is different, it will overwrite it with the IPFS version. Use this carefully, as it could result in character files (bic) getting set back to the state they were in when added to IPFS.
At the same time - this is a good way of protecting against corruption or restoring files/characters to a backup state.
This tool was put together quite fast, so it is probably going to have a bug or two - please let me know if you find any.