Major fan of this idea. But how does one address the GUI challenges presented by leaving GitHub behind? It can't be overstated that GitHub provides an amazing (communal/social) user experience.
You're right, of course. This is just a first step.
One interesting followup idea might be that the BitTorrent library I'm using, webtorrent, also works in browsers over WebRTC. But I'm not using that because I wouldn't know what to do with a git cloned repo inside of a browser tab. Maybe someone else will though. :)
GitHub provides:
- Repo hosting
- Search
- Community
Compared to decentralizing Search and Community, decentralized file storage is easy. Conveniently, centralized repo hosting is also the biggest problem: not being able to search, comment, or report a bug during a DDoS reduces productivity, but not being able to push commits or run CI tools halts it entirely.
The best next move might be to focus on decentralized repository hosting, solve that well, and let users conveniently mirror their GitTorrent repos on GitHub, giving the best of both worlds until Search and Community can also be solved well.
This may mean GitTorrent would need some form of post-push hooks (e.g. to update mirrors or run CI), which I'm sure is doable.
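A hook like that could be a small script on the maintainer's always-online node. A hedged sketch in Python (the `github-mirror` remote name and the `parse_push_lines` helper are my inventions, not anything GitTorrent provides):

```python
#!/usr/bin/env python3
# Hypothetical post-receive-style hook: git feeds one
# "<old-sha> <new-sha> <refname>" line per updated ref on stdin.
import subprocess
import sys

def parse_push_lines(lines):
    """Return the list of refs updated by this push."""
    return [line.split()[2] for line in lines if line.strip()]

def main():
    refs = parse_push_lines(sys.stdin)
    print("updated:", ", ".join(refs))
    # Mirror the new state to GitHub (assumes a remote named
    # 'github-mirror' is already configured in this repo).
    subprocess.run(["git", "push", "--mirror", "github-mirror"])
    # ...then kick off whatever CI runner the project uses.

if __name__ == "__main__":
    main()
```

The same shape works for any post-push action: parse the updated refs, then fan out to mirrors or CI.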
Hmm, decentralized search here should probably just use a Kademlia-like implementation (http://en.wikipedia.org/wiki/Kademlia), where an 'XOR metric' measures the distance between nodes. That way the maximum number of lookup hops is log2(n), where n is the number of nodes (with further optimizations possible).
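For illustration, the XOR metric itself is tiny to implement. A sketch (deriving 160-bit IDs from SHA-1 is an assumption about how node IDs might be chosen, not part of any spec here):

```python
# Kademlia-style XOR distance: each node has a 160-bit ID, and the
# "distance" between two IDs is their bitwise XOR read as an integer.
import hashlib
import math

def node_id(name: str) -> int:
    # Assumption: derive a 160-bit ID by hashing an arbitrary name
    # (real DHTs typically hash a public key or network address).
    return int.from_bytes(hashlib.sha1(name.encode()).digest(), "big")

def xor_distance(a: int, b: int) -> int:
    # Zero iff a == b, symmetric, and it satisfies the triangle
    # inequality, so it behaves like a proper metric.
    return a ^ b

# With routing tables bucketed by shared ID prefix, each lookup step
# halves the remaining search space, so a lookup needs at most
# log2(n) hops for n nodes:
print(math.ceil(math.log2(1_000_000)))  # 20 hops for a million nodes
```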
Obviously someone would need to build a user-friendly interface for all that, etc.
What about using git notes (`man git-notes`) for tracking issues, comments, etc? They are stored as git objects (right?) and could be used for this task?
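For example (a sketch only; the `issues` notes ref and the JSON payload are my invention, not an established convention):

```python
# Store an issue as a git note under a dedicated notes ref, so it
# lives in ordinary git objects and can be fetched and pushed like
# anything else in the repo. Requires `git` on PATH.
import json
import os
import subprocess
import tempfile

def git(*args, cwd):
    return subprocess.run(["git", *args], cwd=cwd, check=True,
                          capture_output=True, text=True).stdout.strip()

# Set up a throwaway repo with one commit.
repo = tempfile.mkdtemp()
git("init", cwd=repo)
git("config", "user.email", "dev@example.com", cwd=repo)
git("config", "user.name", "dev", cwd=repo)
with open(os.path.join(repo, "README"), "w") as f:
    f.write("hello\n")
git("add", "README", cwd=repo)
git("commit", "-m", "initial commit", cwd=repo)
head = git("rev-parse", "HEAD", cwd=repo)

# Attach an issue to the commit under refs/notes/issues.
issue = {"title": "crash on start", "state": "open"}
git("notes", "--ref=issues", "add", "-m", json.dumps(issue), head, cwd=repo)

# Read it back: the note is just another git object.
print(git("notes", "--ref=issues", "show", head, cwd=repo))
```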
I thought about using git notes for this and didn't see how it is better than adding issues, comments, and a wiki inside the repo itself. We are used to putting documentation and tests alongside the code in our repos, so why not add the wiki and issues too?
You can make an excellent case for that: this would require documentation and tests to be up-to-date before a commit would be accepted by whoever maintains the repo.
It's the same old problem of trying to build a peer-to-peer social network. How do you ensure that large files are distributed correctly and quickly with minimal security implications in an environment where nodes are constantly joining and leaving the network? Perhaps it's possible, but if there were an easy way of doing it, there would be more of that sort of thing around.
> How do you ensure that large files are distributed correctly and quickly with minimal security implications in an environment where nodes are constantly joining and leaving the network?
The owner of a project will keep a node always online to fight its own churn. There are no big files in the case of an issue tracker; this is not a video social network. Besides, video social networks do work, e.g. private trackers; the only difference is that they use the web to publish magnet links, which could instead be done over a DHT (cf. PPSP and Tribler).
I guess the projects that can't be on GitHub won't mind the GUI challenges as long as they have some way to have a central repository without having to maintain a server of their own.
Do the same thing you did for GitTorrent. A GUI client that runs a torrent of a PHP file that connects to a torrent of a database file. You just need to always be connected to the latest and greatest.