>>6UJBBUTN (OP) Oh yeah, referring to other threads is still broken, here is the intended link: https://fchan.xyz/b/GT4KLZ21#QMFCOQ9L
Anything related to programming and creating software.
Alright /prog/champs, here is a project for this year: >>fb-QMFCOQ9L → We need to come up with a plan to achieve this utopia.
>>AHC2DSZX
wow fchannel is a broken mess
>>Y8T7YMQE
>Seems like it shouldn't be too hard.
famous last words
>how are we going to structure it? i am not even sure how this sort of thing is structured
we should do it like fchan, initially: some anons create an instance with what they feel is missing on the network (like how jockanon created fitchan) and then expand on that
i bet this place is full of weebs, so anime and roms will be easy to come by at the start
>>fprog-BYF6310Y
i'm not talking about diversity of data you retarded faggot, i'm talking about distribution. a bunch of torrents with different things is never going to be as fault-tolerant as an ipfs setup where everyone's backing up a piece of the archive that OP is talking about. fucking kill yourself
some mathfag needs to make an algo that determines how important it is for some data to be backed up, so that all the available space on this hydra-backup thing doesn't get used for storing over9000 copies of the same shit. it would also have to take the reliability of nodes into account: that way Sonic OC fanart goes to some shitty Raspberry Pi node that some rando set up, while a copy of WikiLeaks goes to a VPS located in a proper datacenter
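something like this, maybe. a toy sketch of the scoring idea: every name, weight and formula here is made up on the spot, the real algo would need actual reliability metrics.

```python
# Toy sketch: score under-replicated data, then greedily hand the most
# important stuff to the most reliable nodes. All numbers are invented.

def replication_priority(importance, copies_existing, target_copies=3):
    """Higher score = more urgent to back up. Zero once enough copies exist."""
    missing = max(target_copies - copies_existing, 0)
    return importance * missing

def assign(items, nodes):
    """Greedy matching: most urgent item goes to the most reliable node
    that still has room for it."""
    items = sorted(items,
                   key=lambda i: replication_priority(i["importance"], i["copies"]),
                   reverse=True)
    nodes = sorted(nodes, key=lambda n: n["reliability"], reverse=True)
    plan = []
    for item in items:
        if replication_priority(item["importance"], item["copies"]) == 0:
            continue  # already backed up enough, don't waste space on it
        for node in nodes:
            if node["free"] >= item["size"]:
                plan.append((item["name"], node["name"]))
                node["free"] -= item["size"]
                break
    return plan

items = [
    {"name": "wikileaks-dump", "importance": 10, "copies": 1, "size": 50},
    {"name": "sonic-oc-fanart", "importance": 1, "copies": 5, "size": 2},
]
nodes = [
    {"name": "datacenter-vps", "reliability": 0.99, "free": 100},
    {"name": "random-rpi", "reliability": 0.60, "free": 10},
]
print(assign(items, nodes))  # the fanart already has 5 copies, so only the dump moves
```

obviously a real version would weigh node uptime history, bandwidth, geography etc. instead of a single made-up reliability number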
>>0B832NFX
yes, subjective to the user who is running the node, and therefore the user would have control over what gets stored on it
I imagine some sort of UI that has a list of all the shit that is already archived, maybe even sorted into categories. that way a user who is donating their storage can select, specifically and/or generally, what they are interested in seeing preserved. the algo takes that as input, checks which of the selected shit is the most appropriate to be downloaded, and starts downloading
maybe also have a "fuck my shit up" option that gives control of the node's content to the public, or copies content settings from some other random node
this UI would also have to allow the user to upload new shit into the archive
>>KV8MOWII
make it optional. brave dudes can go expose themselves with their bare IP while careful dudes can go through Tor/Lokinet/some custom VPN
>I hope that we are not re-inventing Kademlia or a similar protocol
it might be similar to other file-sharing protocols, but from what I understand, none of those protocols were designed for preservation of the content. if shit was popular, it was available on the network, but none of the protocols made sure a file stays available. except, now that I think about it, PeerTube does that for videos. PeerTube is basically torrents for videos, but there is always at least one seed for the file: the server that is hosting it. so regardless of how popular a video is, it is always accessible as long as the server stays up. more popularity just means more peers and therefore faster transfer speed for you.
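the category-selection part could look roughly like this. pure sketch, every field and category name here is invented, there is no real archive index yet:

```python
# Toy sketch of the "donate your storage" flow: the user picks categories
# (or "fuck my shit up" mode, which accepts anything), and the node filters
# a hypothetical global archive index down to under-replicated candidates.

ARCHIVE_INDEX = [
    {"id": "A1", "category": "anime", "copies": 2},
    {"id": "R1", "category": "roms", "copies": 1},
    {"id": "D1", "category": "documents", "copies": 9},
]

def pick_downloads(preferences, index, target_copies=3):
    """Return what this node should download next, least-replicated first."""
    if preferences == "fuck my shit up":
        candidates = list(index)  # the public decides: take anything
    else:
        candidates = [i for i in index if i["category"] in preferences]
    # skip anything that already has enough copies on the network
    return sorted((i for i in candidates if i["copies"] < target_copies),
                  key=lambda i: i["copies"])

print(pick_downloads({"anime", "roms"}, ARCHIVE_INDEX))  # roms first, it has fewer copies
```

the copy counts would have to come from the network itself (nodes reporting what they hold), which is its own can of worms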
>>6UJBBUTN (OP)
Existing technology like BitTorrent can be utilized to maximum effect, given the right setup. First, make sure that each node on the network is a tracker and that it keeps "tracker agreements" with other nodes, i.e. if one of the parties in an agreement receives an upload, then both of them will keep track of the seeds. If a new tracker joins in to track a torrent, then the torrent file and magnet link get updated on the download page and the new tracker immediately starts tracking it. The only problem with this approach is that while the same data is being shared, each torrent file ends up with a drastically different set of trackers. This means that if a few key trackers get taken down, potentially all the seeds become unreachable on the network.
Data gets uploaded in a few different ways: either you are an admin and you have decided you want to upload something, or you send a request to the admin through a website form with the data already uploaded to the server, and the admin then decides whether he wants it or not. If the admin does not want random trash uploaded to his server, he can opt for an alternative method: first the user submits a form describing the data he would send; if the admin is interested, he notifies the user, and only then is the data sent over and uploaded to the node. Once the data is on the node, word gets around to all the nodes that have an agreement with it, and they start tracking it.
This plan is like an evil genius's ultimate plan to destroy the internet once and for all, but that's also the reason why I don't think this would work: it looks too good on paper to work properly.
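the agreement mechanic in miniature, no actual BitTorrent involved, all names made up:

```python
# Toy model of the "tracker agreement" idea: when a node receives an
# upload, every node it has an agreement with starts tracking the same
# infohash too. A real version would announce over the network.

class TrackerNode:
    def __init__(self, name):
        self.name = name
        self.partners = set()   # nodes we hold agreements with
        self.tracked = set()    # infohashes this tracker tracks

    def agree(self, other):
        """Agreements are mutual."""
        self.partners.add(other)
        other.partners.add(self)

    def receive_upload(self, infohash):
        # word gets around: every partner starts tracking it immediately
        self.tracked.add(infohash)
        for partner in self.partners:
            partner.tracked.add(infohash)

a, b, c = TrackerNode("a"), TrackerNode("b"), TrackerNode("c")
a.agree(b)                      # a<->b have an agreement; c is on its own
a.receive_upload("deadbeef")
print(sorted(n.name for n in (a, b, c) if "deadbeef" in n.tracked))
```

and this toy version shows the weakness from the post above: c never hears about the upload, so each torrent's tracker list depends entirely on who happened to hold agreements at upload time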
>>UO242JNJ
>This plan is like the evil genius's ultimate plan to destroy the internet once for all, but that's also the reason why I don't think this would work: it looks too good on paper to work properly.
It's merely a way of providing a more sustainable method of archival, as opposed to relying on a central server (or a handful of different ones) to archive everything online. Right now there's the Wayback Machine, archive.md, etc., but those are super unreliable when it comes to censorship requests. A federated archival site would be literally perfect