Is DC the solution for my problem?

Hey guys!

I’m pretty new to this whole DC thing… I’m facing the following problem and thought that DC might be a solution for it, but I’m not sure yet.
We’ve got multiple computers (50–150) all over the country. I’d like to connect them somehow so that everyone is able to access all the data. It would be good if the communication were encrypted, so I think I need ADC.
Not every computer (Linux, Mac and Windows) is able to run 24/7. Some of them might only be online for a couple of hours a day, some only once a week, others 24/7. I’d like to have a list of all files shared on all computers available at any time, even when those computers are not running. So users can request a file even when the owner is not online: the request is stored on some server/hub/whatever when the user goes offline; when the computer with the file on it comes online, the file is downloaded to a server; and the person who requested the file downloads it later, when they come back online. Is something like that possible with (A)DC? We’ve got some good programmers and I think they’d be able to write it, if it’s possible and something like this doesn’t exist yet.
In addition, it would be nice to have a second list with files stored on external hard drives, so that users can request those files too. The owner would have to plug the drive in, and the file would again be downloaded to a server. All computer users/owners know each other pretty well, so communication is not a problem and they would plug the drive in.
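For what it’s worth, the store-and-forward flow you’re describing (queue a request while the owner is offline, stage the file on the server once the owner connects, let the requester pick it up later) can be modeled pretty compactly. Here’s a minimal Python sketch of that idea — all class and method names are made up for illustration, and this is plain data-structure logic with no DC protocol code:

```python
from collections import defaultdict


class RelayServer:
    """Stores file requests while the owner is offline and stages the
    file on the server once the owner connects (store-and-forward)."""

    def __init__(self):
        self.pending = defaultdict(list)  # owner -> [(requester, filename)]
        self.staged = {}                  # (requester, filename) -> file bytes

    def request_file(self, requester, owner, filename):
        # The requester may go offline right after this; the request persists.
        self.pending[owner].append((requester, filename))

    def owner_connected(self, owner, shared_files):
        # shared_files: filename -> bytes, i.e. what the owner can serve now.
        still_waiting = []
        for requester, filename in self.pending.pop(owner, []):
            if filename in shared_files:
                # Copy the file to the server so the requester can fetch it
                # later, even when the owner has gone offline again.
                self.staged[(requester, filename)] = shared_files[filename]
            else:
                still_waiting.append((requester, filename))
        if still_waiting:
            self.pending[owner] = still_waiting

    def fetch(self, requester, filename):
        # Called when the requester comes back online; None = not staged yet.
        return self.staged.pop((requester, filename), None)
```

The external-drive case in the second list would work the same way: the request just sits in `pending` until the owner plugs the drive in and the files show up in their share.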

Well… You are the experts here… So: Is that possible or is DC the wrong protocol for that? Do you have an idea how to do this?
We tried to set up ADCH++ but we’re still facing some problems (the server won’t answer +regme or similar commands, but we’ll figure that out :smiley:)

This isn’t really possible in the default protocol. I suspect that the easiest way to do this is to write a bot that acts as a regular DC client. It’d download every user’s file list and keep them stored locally. This bot could then “pretend” to have all the files of every user it has seen, and requesting its file list would provide all the files “merged” into a single large file list (not sure if this is a good idea, as such a list may get incredibly huge). The bot can easily reply to search queries without much overhead. Similarly, users can queue files “provided” by this bot, and the bot would reply with something like a “no slots available” error for as long as it doesn’t have the file yet.
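The merged-list part of that bot could be sketched roughly like this — again plain Python with no actual DC protocol handling, and every name here is hypothetical:

```python
class MirrorBot:
    """Caches every user's file list and answers searches against the
    merged view; files it hasn't actually fetched yet get a
    'no slots available' style reply so clients keep retrying."""

    def __init__(self):
        self.file_lists = {}  # user -> set of shared filenames
        self.cache = {}       # filename -> bytes the bot has downloaded

    def store_file_list(self, user, filenames):
        # Called after downloading a user's file list; kept even after
        # that user disconnects, so their files stay searchable.
        self.file_lists[user] = set(filenames)

    def merged_list(self):
        # The "pretend" share: everything any user has ever listed.
        if not self.file_lists:
            return []
        return sorted(set().union(*self.file_lists.values()))

    def search(self, term):
        # Cheap case-insensitive substring search over the merged list.
        return [f for f in self.merged_list() if term.lower() in f.lower()]

    def download(self, filename):
        # Until the bot has fetched the file from its real owner, it
        # answers as if no upload slot were free.
        if filename in self.cache:
            return self.cache[filename]
        raise RuntimeError("no slots available")  # client retries later
```

The memory concern from above is real, though: with 150 users the merged list is basically the union of all their shares, so a real implementation would probably want an on-disk index rather than one giant in-memory file list.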

So yes, it sounds like that would be possible in a slightly hackish way. I’m not aware of such a bot existing, and it may require some serious effort to write. One can think of alternative strategies to get this functionality right in the protocol, but I suspect that would be more complex.

I think I’ve read some protocol proposals for this at some point, but I’m not sure about their status.

Whether DC is the right protocol for that: Well, it’s… possible. I’m not aware of a better solution for what you want, save for having a shared FTP/rsync server with an insane number of hard drives, to which everybody just uploads their stuff.

Thanks a lot!

Well, we had a root server before. But it gets pretty expensive if you want to store a huge amount of data (I’m talking about 60–80 TB), so we’re thinking of a different approach :smiley:
Another idea was to upload the files to hosters like RapidShare and download them when needed. The problem here is that the hoster might change its policies, delete files once an account reaches a certain level, or simply not keep your files forever…

Okay. So we will have to try to program a client bot now :smiley: Thanks!
And if you come across those protocol proposals again, please let me know where they can be found :wink:

Thanks again! I’m sure we’ll need help with our project soon…
so cya :laughing: