Bittorrent's 'Project Maelstrom' Aims To Deliver All Websites Through Torrents


dovah-chan

Honorable
Indeed, it is a novel concept, but it doesn't solve the congestion issues that North American ISPs suffer from. That is their own fault for (purposely) not maintaining their infrastructure properly and for providing little, if any, customer service and support.

But still, even with BitTorrent being a peer-to-peer protocol, they don't seem to want to mention the existence of seedboxes. While a seedbox is not quite a dedicated server, it essentially acts as one: seedboxes are usually the big seeders in a lot of torrents, and they host as well as seed the files being shared. Also, if websites were peer-to-peer, it would take a long time for sites to load.

Think of it like this: let's compare a dedicated server to RAM. GDDR is great for large, continuous transfers of data, but it doesn't have the same snappy feel as DDR DRAM, which is optimized for many small programs transferring small amounts of data. The same goes for the BitTorrent protocol versus HTTP. It takes a noticeable amount of time for a torrent to get up to speed, but you can download a 213 KB JPEG in your browser in a snap.

This is partly down to TCP handshake overhead. With HTTP, it's just you and one dedicated server. With BitTorrent, you have to contact a much larger number of peers before you can begin receiving data from them.
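A rough sketch of that difference, with made-up peer addresses (real clients connect in parallel, but every connection still pays its own round trip before any data flows):

```python
import socket
import time

def handshake_time(host, port=80, timeout=3):
    """Time a single TCP three-way handshake (connect only, no data sent)."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            pass
    except OSError:
        return None  # unreachable or timed out
    return time.monotonic() - start

# One HTTP origin versus a handful of swarm peers.
# The peer addresses below are TEST-NET placeholders and will simply time out.
origin = "example.com"
peers = [("203.0.113.10", 6881), ("198.51.100.7", 6881), ("192.0.2.55", 6881)]

print("origin handshake:", handshake_time(origin))
for host, port in peers:
    print(host, handshake_time(host, port, timeout=1))
```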

Anyway, if this does get a large amount of support, these issues could possibly be resolved, but companies are unwilling to just walk away from the millions of dollars they have invested in their servers.
 

jdwii

Splendid


Very good explanation, Dovah-chan; I couldn't have explained it better myself. Perhaps they should let you work here?
Edit: if I did add anything to your statement, it would be that upload speeds are still a problem here in the U.S. (not sure about the rest of the world).

If this is how the web worked, what happens when you're trying to pull data from an Exede customer with high ping times? Not to mention, won't this eat into the bandwidth of normal users, since we are all getting caps on our data?
 

wavetrex

Distinguished
What about dynamic content (which is pretty much 99.9% of the web today)?

The page that is delivered is customized to the user that is accessing it, and not "static" like the old pure HTML websites.
How would torrents solve this? They wouldn't, obviously...

Maybe they want to use torrents for larger stuff like pictures or videos, but again... instead of loading a picture almost instantly, you would have to wait while your host pings everyone else in the swarm and asks for the pieces.

Not really a good idea, to be honest; the company is daydreaming...
 

bit_user

Polypheme
Ambassador
A distributed P2P web sounds great, but ISPs can and will keep it from ever getting far off the ground.

Dovah, it's possible to design a fully distributed system where you aren't reliant on any single seed. The problem is reliability. As long as the content is popular enough, enough copies of all the blocks will be distributed among the peers that viewers won't have much trouble assembling a complete copy. But it would only tend to work for short-lived, fairly popular content.

How do you convince people to devote their HDD space & bandwidth to storing & serving random chunks of content, you ask? Much like bittorrent works today, peers would favor those who serve them more blocks. So, the more you store and seed, the better your download speeds will tend to be.
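A toy sketch of that reciprocity rule, loosely modeled on BitTorrent's choke/unchoke behaviour (peer names and byte counts are made up; this is not the real algorithm):

```python
import random

def pick_unchoked(peers, uploaded_to_us, regular_slots=3):
    """Favor the peers that have sent us the most data, plus one random
    'optimistic' slot so that newcomers get a chance to prove themselves."""
    ranked = sorted(peers, key=lambda p: uploaded_to_us.get(p, 0), reverse=True)
    unchoked = ranked[:regular_slots]
    leftovers = [p for p in peers if p not in unchoked]
    if leftovers:
        unchoked.append(random.choice(leftovers))  # optimistic unchoke
    return unchoked

# Hypothetical peer IDs and the bytes we've received from each of them.
peers = ["A", "B", "C", "D", "E"]
received = {"A": 9_000_000, "B": 120_000, "C": 4_500_000, "D": 0, "E": 2_200_000}
print(pick_unchoked(peers, received))  # e.g. ['A', 'C', 'E', <one random other>]
```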

That said, I revert to my original point: the ISPs are too powerful. Also, wavetrex has a good point about dynamic & personalized content.
 


Stop blaming it on the ISPs and start blaming it on the tools that want to create a square wheel just to rebel against the man.

Static files can be delivered wonderfully through the bit torrent protocol because it doesn't matter where the data comes from, what order it arrives in, how long it takes, how much jitter there is, what the skew between sequential blocks is, etc...

Dynamic or time-sensitive data, on the other hand, has never worked and will never work properly over bit torrent. Client developers have been integrating streaming features into their clients for years, and none of them work properly, because a stream requires data to arrive in a certain order and within a certain time window. Since the torrent client fetches blocks in a best-effort fashion, the transfer can never stay steady long enough to calculate and fill a buffer.
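A toy illustration of why: a typical rarest-first picker grabs whichever piece is scarcest in the swarm, while a player needs the next piece in playback order (the availability numbers below are invented):

```python
def rarest_first(missing, availability):
    """Typical torrent behaviour: grab whichever missing piece is scarcest."""
    return min(missing, key=lambda piece: availability[piece])

def sequential(missing):
    """What a streaming player needs: the next piece in playback order."""
    return min(missing)

# Hypothetical 10-piece file; availability[i] = how many peers hold piece i.
availability = [8, 7, 9, 2, 6, 1, 9, 3, 8, 5]
missing = set(range(10))

print(rarest_first(missing, availability))  # -> 5 (rare, but useless for the buffer right now)
print(sequential(missing))                  # -> 0 (what the player actually needs next)
```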

This will also be a complete nightmare for synchronous and encrypted connections that expect data to follow a predictable route with minimal jitter and minimal packet loss.

A distributed P2P web sounds great only to coffee-shop hipsters who have no idea what's actually involved and have gleaned all of their insight from blog posts written by journalists who have misunderstood something that they read about on wikipedia or buzzfeed. For those of us who actually know how things work, this joins the ranks of ridiculously stupid ideas alongside "solar roadways" and "thorium powered cars".

In fact, there's already a very similar implementation of this available called Freenet. Unsurprisingly it works with only static content and is used primarily to distribute illegal material.
 

Bezzell

Honorable
The internet sprouted due to the distribution of illegal material. If I recall correctly, every BBS I ever visited in the 80s was filled with porn and games. I guess it was all a horrible mistake. The world is full of black and white, with a lot of grey areas. Yin and Yang.

I actually agree BT is a fun but horrible idea. But that's no excuse to play the "ethics" card. If somebody robs a bank, you prosecute the bank robber, not the road he drove in on.
 

yhikum

Honorable
The idea of distributed delivery of information is not new. The whole architecture that built the Internet relies on a multitude of servers and routers to ensure delivery of content.

That idea aside, we already have a BitTorrent-like service called Tor. It is very useful for anonymous browsing, not so good for speed or dynamic content delivery. Is BitTorrent trying to claim what Tor has already accomplished?
 


You say that as though it's a bad thing...

Why wouldn't you be in support of anything that's going to push the boundaries of what technology can do, creating incentive for better technology that can achieve these things?
 


Yes. It doesn't work in all cases, or even in many cases. What invention does?

Do remember that 'illegal material' includes such things as, oh, any news of the outside world to China... or communication between dissenters in Egypt. Would there be a lot of issues with a P2P internet? Yes. Would there be a lot of potentially valuable ways to use it? Yes.
 

koga73

Distinguished
I agree with wavetrex. As web developers, we are constantly making tweaks to pages and updating content. If you're downloading from peers, then you're going to get an old, static page. I just don't see how this would be feasible at all.
 

bit_user

Polypheme
Ambassador
There have been numerous documented cases of ISPs throttling and even blocking BitTorrent. ISPs simply want to transmit the least data for the greatest price, because more data = more frequent backbone & switch upgrades. 'nuff said.

Thank you. I was trying to remember the name.

I think we all agree that it will work for some things but not others. However, I think you're too dismissive about dynamic & user-generated content. Usenet is a very early example of a fully distributed framework for such content, so clearly it's doable (and hopefully can be done better now).

To your point about low-latency, realtime streaming content, I don't think anyone said it would work for that. I certainly didn't, and I wouldn't agree with anyone who did. However, if you were willing to accept a bit of latency, there's no reason a slightly modified version of BitTorrent couldn't work tremendously well for broadcasts.

Finally, I think you're awfully dismissive about "the man", as though it's an entirely fictitious concern of the paranoid. There are many countries with repressive governments that actively block websites, censor/limit communications, and carry out a variety of surveillance & enforcement operations on their citizens.
 


Nothing on Usenet was dynamic; it just had the illusion of being dynamic, and the only reason it existed in the way that it did is that inter-network communication was incredibly immature compared to intra-network communication (it still is, in many ways). Usenet servers had to announce and push/pull from each other the content they wished to serve to their userbase. There was often a significant delay between the time a message became available on one server and the time it was available on another. Fortunately, Usenet has rather loose timing constraints, so that wasn't a big deal for this kind of content. It's slightly analogous to the way CDNs establish presences on major networks near major population areas and then relay frequently accessed static content for commercial subscribers to improve throughput and reduce bandwidth costs. It's far, far more efficient than trying to shove massive amounts of data across the country over multiple unreliable interconnects.

There is already an excellent method for handling streaming broadcasts: it's called IP multicast, and it works quite well as is. Most major live TV broadcasts, such as sporting events, use IP multicast. Streaming content over any P2P protocol causes far, far more problems than it solves. Peers are inherently unreliable from a time-sensitivity perspective; it just can't be done.
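For reference, joining a multicast group is just a couple of socket options; here is a minimal receiver sketch (the group address and port are placeholders, and a real deployment needs multicast-capable routing between sender and receivers):

```python
import socket
import struct

GROUP = "239.1.1.1"   # placeholder administratively-scoped multicast group
PORT = 5004

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM, socket.IPPROTO_UDP)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", PORT))

# Tell the kernel (and upstream routers, via IGMP) that we want this group.
mreq = struct.pack("4sl", socket.inet_aton(GROUP), socket.INADDR_ANY)
sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)

while True:
    data, sender = sock.recvfrom(1500)
    print(f"{len(data)} bytes from {sender}")  # every subscriber receives the same single stream
```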

I'm dismissing concerns about "the man" because it is largely a fictitious concern of the paranoid. Any regime that seriously wants to restrict telecommunication access to the outside world is going to do just that. Make no mistake, if a regime wants to block encrypted traffic, inspect relayed traffic, or allow only unencrypted HTTP 1.1 traffic they really won't have to try very hard to do so. Tor and similar ventures do a reasonably good job of obscuring traffic patterns but the impact on network performance is immense and it only gets worse as more users adopt it.
 

bit_user

Polypheme
Ambassador
No, it cannot. Three reasons for this: first, a streamed game must be run on a server, somewhere. So that pretty much defeats the point of decentralized communication. Second, since the game experience is unique to each player, there are no economies of scale provided by distributing it. Third, gameplay is very sensitive to latency (aka lag), and the best way to combat that is to have the shortest possible path between the client and server.

Bittorrent is cool tech. Since you're interested enough to read this article and post, I'd encourage you to read up on it. Perhaps then you'll gain a better appreciation for why it's a poor solution to game streaming.
 

cypeq

Distinguished
I don't see the timing problem that everyone is mentioning here; HTTP is much slower than torrents at transferring data. Websites won't be moved into a torrent cloud wholesale; that's not the point. You wouldn't connect to a node and patiently wait for someone with the content you want. Websites are on the server and will remain there. You would obviously connect to the website's server as the primary node immediately.

More static websites, like Tom's Hardware for example, could afford things like content pre-caching.

Immediate content would be provided from the web server, but images further down the page, for example, would be queried from the P2P swarm.
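A rough sketch of that split; the manifest format below is entirely made up, just to show the idea of the origin serving the HTML and pointing the client at a swarm for heavy static assets:

```python
# Hypothetical asset manifest: the origin serves the HTML and tells the client
# which heavy, static assets also have a swarm (info-hash) it may fetch from.
MANIFEST = {
    "/index.html":       {"source": "origin"},
    "/img/hero.jpg":     {"source": "swarm", "infohash": "aa11..."},  # placeholder hash
    "/video/review.mp4": {"source": "swarm", "infohash": "bb22..."},
    "/api/comments":     {"source": "origin"},  # dynamic, must stay on the server
}

def route(path):
    """Decide how the client should fetch a given asset."""
    entry = MANIFEST.get(path, {"source": "origin"})
    if entry["source"] == "swarm":
        return f"fetch {path} via P2P (infohash {entry['infohash']})"
    return f"fetch {path} via HTTP from the origin"

for p in ("/index.html", "/img/hero.jpg", "/api/comments"):
    print(route(p))
```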

This won't work well for all websites; Facebook, for example, which has megatons of dynamic, personalized data, won't benefit all that much.

But
 

onionjohn

Reputable
I don't think any large corporation is going to take this P2P internet thing too seriously.
Another problem with this project is that people using it are storing THEIR stuff (webpages, software installers, copyrighted material) on some unknown computer; encrypted or not, that will be quite dangerous(?)
Hacks and malware will also spread much more easily, as there are so many more independent hosts of the same file; one infected host is enough. Of course you can use checksums to check the integrity of the files, but what if you're transferring 2,000 files of 1 MB each? It will take ages to generate the checksums for each one.
There are too many things that need to be solved before we can replace the HTTP-based internet with a P2P-based one.
 

bit_user

Polypheme
Ambassador
I'm not sure about that. I think the idea was to make it completely serverless. Otherwise, it doesn't seem that interesting.
 

bit_user

Polypheme
Ambassador
BitTorrent already does this, and I'm not aware of any known examples of content being tampered with. When you download a file via BitTorrent, you're getting chunks of that file from random peers around the world. The reason it's safe is that the .torrent file contains a checksum for each chunk. As a client receives the chunks, it validates them against those checksums. And the checksums are crypto-strength, meaning it's not feasible to compute an alteration to the content that wouldn't change the checksum.

So, as long as the .torrent file comes from a trusted source, you don't need to trust any of the peers.
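A simplified sketch of that per-piece check (BitTorrent v1 stores a SHA-1 hash per piece in the metainfo; the piece size and data below are toy values):

```python
import hashlib

PIECE_LEN = 262_144  # 256 KiB, a common piece size

def piece_hashes(data, piece_len=PIECE_LEN):
    """What the .torrent creator does: hash every fixed-size piece of the file."""
    return [hashlib.sha1(data[i:i + piece_len]).digest()
            for i in range(0, len(data), piece_len)]

def verify_piece(index, piece, expected_hashes):
    """What the client does for each piece received from an untrusted peer."""
    return hashlib.sha1(piece).digest() == expected_hashes[index]

# Toy demonstration with in-memory data standing in for a real file.
original = b"x" * 600_000
expected = piece_hashes(original)

good = original[:PIECE_LEN]
tampered = b"y" + good[1:]
print(verify_piece(0, good, expected))      # True  -> piece accepted
print(verify_piece(0, tampered, expected))  # False -> piece discarded and re-requested
```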
 