Remember why we were able to use Skype for this?
Pepperidge farm remembers!
Joke aside, I transferred a lot of files over instant messengers and it worked quite well. Nearly everyone had at least a Yahoo/Messenger/Skype/ICQ account, which made this rather simple, and because the providers either lacked the capacity to inspect transfers, weren't interested, or the transfer was actually p2p, it was perfectly fine. A bummer if the modem connection went down or you had to hang up because the family wanted to make a call, but hey, it was glorious. (No, this is not sarcasm, it really did work.)
I recently tried to send a file to a small group of friends on Facebook. After uploading, it rejected the file because it was a video with a copyrighted song playing in the background.
"This file cannot be sent, since its content cannot be confirmed to be safe and virus-free". Maybe not today, but it's not an inconceivable next step...
You guys are acting like it's such a big deal to stego it into a Word document by base26-ing it into sentences as initialisms using Markov chains. Facebook sends my 14 megabyte Word files no problem (to transfer a 700 kB wav file). I mean, sure, it was easier on me and my recipients when I could just dump the base64 in there. And when it started rejecting that, adding the Markov chains as initialisms was no walk in the park; you have to let your recipient know how to get it back out again (basically a Perl one-liner). But at least there are lots of Gutenberg ebooks to train on, and really it's just not that big of a deal. Anyone can do it in an hour or two in pretty much any language, just follow tutorials. Then just tell your recipients what to do and they'll jump through whatever hoops you want. I mean, how else am I going to send a 700 kilobyte wav file? FedEx a custom-pressed CD under my own label?
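For the curious, the scheme round-trips with surprisingly little code. A toy sketch, where a tiny hand-made word bank stands in for a Markov model trained on Gutenberg text (a real one would produce more plausible sentences):

```python
import string

# Toy word bank standing in for a Markov model trained on Gutenberg text.
WORDS = {c: c + "at" for c in string.ascii_lowercase}  # e.g. 'b' -> 'bat'

def bytes_to_letters(data: bytes) -> str:
    # Base-26 encode: treat the payload as one big integer.
    # The leading 0x01 sentinel preserves leading zero bytes.
    n = int.from_bytes(b"\x01" + data, "big")
    letters = []
    while n:
        n, r = divmod(n, 26)
        letters.append(string.ascii_lowercase[r])
    return "".join(reversed(letters))

def letters_to_bytes(letters: str) -> bytes:
    n = 0
    for c in letters:
        n = n * 26 + string.ascii_lowercase.index(c)
    raw = n.to_bytes((n.bit_length() + 7) // 8, "big")
    return raw[1:]  # strip the 0x01 sentinel

def encode(data: bytes) -> str:
    # Each letter becomes the initial of a word; join into "sentences".
    return " ".join(WORDS[c] for c in bytes_to_letters(data))

def decode(text: str) -> bytes:
    # The Perl-one-liner step: take the first letter of every word.
    return letters_to_bytes("".join(w[0] for w in text.split()))
```

Decoding really is the one-liner step: grab the first letter of every word and reverse the base-26 conversion.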
I tried to send a zipped archive of some code on Google the other week (old project), it rejected it, so I zipped it again with a password, it still rejected it.
In the end I had to throw it up on the web with a strong password.
Not sure if the earlier unpassworded attempt set a flag for the subsequent one, but Google is almost superhuman when it comes to blocking malicious stuff. I do wonder what the false positive rate is like.
I wouldn't blame them, given how much "easy" bad stuff people do, except that all the ad networks apparently allow an advertiser to run whatever JavaScript they want on a third-party website. I would be on their side if they disallowed JavaScript in ads.
Oh no, I've encountered this many times already. Sometimes, double-wrapping the archive satisfies a system that isn't recursively extracting archives. :/
Apple Mail has this really cool feature: when you email someone a file that is too large, it uploads the file to iCloud automatically and then sends that person a download link from iCloud that expires in 30 days.
Thunderbird has/had this as well; you could even select from a few providers, like Box and that Ubuntu cloud service which is now deprecated. Probably a decent solution, but it still needs a 3rd-party server, which you didn't with the p2p IM solutions.
I use it regularly and have a self-hosted WebDAV server for this reason. IIRC, an extension is required for a custom server, but the functionality comes from Thunderbird itself.
It doesn't matter if the receiver uses Apple products or not—so long as the sender has an iCloud account, it'll work. As far as the receiver cares, it's just a URL where the file is.
I remember all the disconnects with ICQ file transfers. I think they added a recovery mechanism eventually.
Some friends even ran their own FTP servers (I think most people used War FTP) before ISPs started blocking ports.
Now it's all about uploading to "the cloud" or using sneakernet. My friend uses mega.nz, which seems decent for file sharing. I, personally, never underestimate the bandwidth of a sedan driving down the highway with a single USB key.
Skype somehow gets worse with every update. I have given up on it altogether; now I just use appear.in instead. It doesn't require signing in and supports screen sharing.
I'm lucky I set up call forwarding to my google number a while ago. I can't even receive calls anymore! If the skype gods are smiling, one of my associated phones will randomly ring if somebody tries to contact me on skype (Google voice I think tries to ring all your phones in hopes that you'll pick up one of them, but in practice it can be somewhat random what actually connects).
Sometimes it doesn't though, and then I have no idea you tried to contact me, because skype is so broken that it /never/ rings when you call me.
This makes matters worse when my skype number is listed as the contact number on some contracts I've signed (their computer system was set up to only accept Japanese phone numbers).
Don't know if you're serious or not, but I'm pretty sure M$ stores the files if you manage to send anything, which, with the Linux "client", is a struggle for certain.
One thing I used to do when I wanted to send files to people outside of the context of a chat was just drop them in a directory on the web server that was running on my machine. Sharing files with folks in a private IRC channel was incredibly easy this way.
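That pattern still works with nothing but the stdlib (`python -m http.server 8000` from the directory in question); a slightly more explicit sketch, with the port and directory as placeholders:

```python
# Serve a directory over HTTP: the modern equivalent of dropping files
# into the docroot of the web server running on your own machine.
from functools import partial
from http.server import SimpleHTTPRequestHandler, ThreadingHTTPServer

def serve(directory: str, port: int = 8000) -> ThreadingHTTPServer:
    # ThreadingHTTPServer handles several simultaneous downloads;
    # port 0 asks the OS to pick a free port.
    handler = partial(SimpleHTTPRequestHandler, directory=directory)
    return ThreadingHTTPServer(("0.0.0.0", port), handler)

# Usage: httpd = serve("/path/to/share"); httpd.serve_forever()
```

Anyone in the channel just grabs http://your-host:8000/filename; the catch, then as now, is being reachable from outside your NAT.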
It will come back. When C++ becomes " retro cool", the current generation of JS rockstar ninjas will shout it from the rooftops: "I've reimplemented Atom in C++ and it's SO quick, man!"
I can remember the number, but I've forgotten the password :-) I tried to get them to reset it a while back but it's tied to a long dead email address so I've no way to verify it's me and they won't help.
When I tried to reset my ICQ password, they wanted me to message 5 friends and have them vouch for me, people I haven't talked to in 15 years. Otherwise they wouldn't unlock my account, which is, coincidentally, for some unapparent reason, in some sort of restricted mode where I can't send any messages. Support would not budge and unlock my account even after I provided ample proof.
Why is it more difficult to get back into my 20-year-old ICQ account than it is to reset any other password I have, even on "high security" sites like Bitcoin exchanges or domain name providers?
Same problem, and I had the password, but hadn't used it in forever... shame, 6-digit account IIRC. But, realistically, I haven't touched ICQ since around 2000 or so. I do wish a third-party messenger not tied to a major social media chain could take hold again. Something that only acted as an exchange/email validation system... Actually, Keybase could be a nice central verification tool for a distributed p2p client.
Our tool lets you use the same P2P method for Remote Desktop access and link sharing (mblok.io). I saw a lot of these tools come up when WebRTC started out, however, they are limited by file size and are really unstable.
I dread using Skype on my Windows 10 laptop. The application crashes half the time, logs me out for no reason, and takes a ridiculous amount of time to load up.
I haven't seen an app more ignored than Skype. This should have been what WhatsApp is today, but MS just killed it.
I think I'm going to have to settle for this or similar for non-phones (tablets, desktops). I kept looking for a simple messaging app (username, password... maybe an email account) to avoid having to share my number, etc. The first thing Telegram did (like many others) was pull my contacts and display other users in my contacts (based on their number) that were also using Telegram.
There just doesn't seem to be anything out there that's friendly, old-school messaging (except for complicated stuff that would be difficult for other users to get set up).
There is a whole market, managed file transfer, focused only on this. The industry is/was strongly oriented toward integrating these offerings into ubiquitous apps like Outlook. UX is the main issue.
Neat. It uses client-side crypto (AES-128-GCM) to secure the file; the key is in the fragment portion of the URL so it doesn't automatically hit the server (assuming you trust the server JS).
The protocol is a little bit strange, though. The file metadata is transmitted as an X-File-Metadata header on upload, and includes the SHA256 hash of the original (unencrypted) file (as the "aad" parameter to the X-File-Metadata upload parameter). This is a little concerning for privacy; while the filename is easy to disguise, hiding the SHA256 sum requires modifying the file in some way. Of course, this might only be a concern for uploading known files, but it's still a bit of an infoleak.
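To make the disguise point concrete: a single appended junk byte gives the upload an unrelated SHA-256, at the cost of shipping a slightly modified file (stdlib sketch; the streaming helper is just so big files never sit fully in memory):

```python
import hashlib

def sha256_stream(path: str, chunk_size: int = 1 << 20) -> str:
    # Hash a file in chunks; memory use stays at one chunk.
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def disguise(data: bytes) -> bytes:
    # One appended byte completely changes the SHA-256, defeating
    # known-file matching against the hash sent alongside the upload.
    return data + b"\x00"
```

The recipient just has to know to drop the trailing byte; an archive with a randomized comment field achieves the same thing.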
It's also strange in that the key isn't checked in any way (even for sanity) before initiating a download, so if you mess up and leave it off (or corrupt some bits), you won't find out until the end of the download that you can't get the file. Worse, the file will be deleted, forcing you to ask your sender for another copy.
The client-side crypto has one other downside: there doesn't seem to be a standard way in JavaScript to stream a POST request yet. You could emulate it with e.g. WebSockets, but those are a lot more heavyweight and CPU-intensive (for the server) than simple POST requests. So, the current implementation just encrypts the entire file as one giant block, and then uploads it - placing the whole file in memory. Hence the 1GB soft-limit. Downloads are similarly limited.
Luckily, non-browser clients can do whatever they like, so I wrote a Python client that's compatible with the server, but uses streaming POST and on-the-fly en/decryption to save memory. Check it out at https://github.com/nneonneo/ffsend - feedback welcome!
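The streaming idea is simple enough to sketch: read, encrypt, and yield one chunk at a time, then hand the generator to http.client, which sends an iterable body with chunked transfer encoding. Here a toy SHA-256 counter-mode keystream stands in for real AES-GCM (it is NOT authenticated encryption; a real client should use a proper crypto library):

```python
import hashlib
import itertools

def keystream(key: bytes):
    # Toy keystream: SHA-256 of key||counter, yielded byte by byte.
    for counter in itertools.count():
        yield from hashlib.sha256(key + counter.to_bytes(8, "big")).digest()

def encrypt_chunks(path: str, key: bytes, chunk_size: int = 1 << 16):
    # Generator of ciphertext chunks; only one chunk is in memory at a time.
    # XOR is symmetric, so the same function also decrypts.
    ks = keystream(key)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            yield bytes(b ^ next(ks) for b in chunk)

# Hypothetical upload call (host and path are placeholders):
#   conn = http.client.HTTPSConnection("send.example.org")
#   conn.request("POST", "/api/upload", body=encrypt_chunks("big.bin", key))
```

This is why a non-browser client can skip the 1GB soft-limit: nothing forces it to hold the whole ciphertext in memory.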
> This is a little concerning for privacy; while the filename is easy to disguise, hiding the SHA256 sum requires modifying the file in some way. Of course, this might only be a concern for uploading known files, but it's still a bit of an infoleak.
Does it matter? The file is behind a URL with a random id (and the hash expires from redis after a day). Even if someone guessed your id within a day, they know essentially nothing about your file or you. And if they had your URL, they could download the file anyway, making it moot.
> Hence the 1GB soft-limit.
Mozilla stores the files on S3. That needs a reasonable limit.
Mozilla, on the other hand, knows your SHA-256 and your IP. So if you're uploading some known offensive file, the logs could be subpoenaed, etc. etc. Normally it's just preferable to have the service provider know as little as possible.
With the advent of smartphones, most people are running a computer 24/7. But that computer is still either behind a NAT or on a connection where data is precious.
NAT punching is a thing, but it makes the implementation of p2p a lot more complicated.
UPnP was supposed to help with this as well, before it became a security disaster. There's also stuff like https://github.com/danoctavian/bluntly/blob/master/README.md to do NAT holepunching without a central server (using a DHT), but again, adoption and the actual ergonomics of usage (npm, the config file, key distribution, etc.) make it fail the "could my grandma use it" test; it's just not easy enough for the un-devops'd masses.
Some cryptocurrency based on file backup (say, Sia or Diskcoin) could completely solve this problem, since you could just drop a magnet link to some encrypted files to your friend and your file stays accessible for as long as your funds cover it.
> If all computers were publicly reachable it would be trivial to send files peer-to-peer.
WebRTC exists today, and it's quite good. It's not a technology problem, it's a matter of practicality. Mobile devices being reachable over the network 24/7 is just not realistic (connectivity falls off, battery considerations, etc.). I don't want my phone to heat up and come to a crawl because the video I just shared is being downloaded by three friends over LTE while I ride the train.
I think it is more an IPv4 problem: everyone is firewalled and NAT'd up the wazoo, and direct connections require some UDP voodoo (or at least they did last time I investigated this).
It's a UI problem and a legal problem. It's difficult to create a service that is accessible to people who barely understand what a file is, and that won't be taken over by pirates and then shut down by law enforcement.
The fact that files can only be downloaded once from this new service isn't just a coincidence.
I agree that it's a sort of UI problem. I could be sharing a desktop with a customer with TeamViewer (which maybe has file sharing; it's telling that I'm unsure about it) and at the same time talking with him on Skype, which for voice is still good (it's bad for everything else: text, files, screen sharing; video calls are of very limited use). But I can't easily send a file to the other guy.
Unfortunately, AFAIK there has been no successful open point-to-point file transfer protocol that different OSes could implement and be interoperable over the Internet. Going to https://send.firefox.com/ and dropping a file there is not an improvement; it's still centralized. Is there any solution to the problem of discovering the addresses of the sender and the recipient without a central server? I would think it's an impossible problem, but there are clever people out there. Maybe mesh networks?
Indeed, I remember when a "pair of netcats" was the preferred file transfer solution, and even for IM "meetings". Almost no one knows what I'm talking about if I mention it now, even people who are otherwise experienced computer users and/or developers themselves.
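For anyone who missed that era: the pair was literally `nc -l -p 9000 > file` on the receiving end and `nc host 9000 < file` on the sending end. The same idea in a few lines of Python (host and port here are placeholders):

```python
import socket

def receive(port: int, out_path: str) -> None:
    # The `nc -l -p PORT > file` side: accept one connection, write to disk.
    with socket.create_server(("0.0.0.0", port)) as srv:
        conn, _addr = srv.accept()
        with conn, open(out_path, "wb") as f:
            while chunk := conn.recv(65536):
                f.write(chunk)

def send(host: str, port: int, in_path: str) -> None:
    # The `nc HOST PORT < file` side: connect and stream the file.
    with socket.create_connection((host, port)) as conn:
        with open(in_path, "rb") as f:
            conn.sendfile(f)
```

No resume, no encryption, no NAT traversal: exactly the trade-offs that made it stop working once everyone was behind a home router.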
In a similar vein, there still isn't any good attempt at a user-friendly E2E-encrypted email solution that integrates with what people already use (minus a few PGP-based browser extensions, which don't have much adoption yet).
If we still paid for software like we did 15 years ago, we'd probably have a perfect solution by now. Instead we have a bunch of "free" options that all want to harvest our data.
Email attachments are the lowest common denominator despite the 25 MB cap, occasional diversion into a spam folder, no sender authentication, no encryption, poor suitability of email clients and servers for bulk file transfers, etc. Whatever hopes to replace them must be as popular (network effect), as easy to use, and free. Sharing very large files (hundreds of MB to multi-GB) is too costly for a free service. People will not set up or maintain a server, an always-on PC is less reliable and has a slower uplink than a data center, and smartphones have limited data quotas, storage, and batteries. Slow IPv6 deployment and middleboxes (firewalls, home CPEs) make efficient p2p connections difficult.
If it's free, there's little incentive for a company to spend the money developing an easy UI and marketing it to critical mass. Even simple-to-use Dropbox has not made a significant dent. BitTorrent never became mainstream at the level of email, but it did incentivize people to seed with the prospect of pirated content. The MPAA/RIAA have deputized any website that accepts user content as copyright police and demonized non-enterprise file hosting services.
In a way the success of Facebook is a reflection of the spammy, phishing/malware filled, and slow nature of email and difficulty making your own website; if email continued to improve there wouldn't be demand for a modern solution. File syncing and sharing should be integrated into the operating system. However Microsoft's OneDrive and Apple's iCloud are inferior to Dropbox so a third party solution is still relevant.
Well, not all the time. Never works at my parents, for example. Works in my flat about 80% of the time (although iOS 11 + High Sierra seem to be improving that ratio slightly). Never worked at $JOB-2. Doesn't work if you want to send to a non-Apple device. etc. Maybe Apple will spin up something out of the DeskConnect ashes for these cases.
I am currently enjoying* the irony that Airdrop works better between my phone (currently on the work guest wifi network) and the work laptop (wifi turned off, ethernet on work corporate network) than in my flat (where everything is on the same wifi + network).
For business file sharing it has been solved dozens of times. The problem is that the solution tends to turn file sharing into a management problem. You end up with file structures, naming conventions, permissions. Yuck! It is vital to have a way of sharing large files that just requires a social interaction between two people and always just works.
Enterprise and consumer solutions are necessarily different because of the legal environment. You don't want random employees sending arbitrary files to each other with no auditing, security, organization, etc.
I get where you are coming from. But a lot of employees have the authority to decide this kind of thing for themselves. Making them subject to technical restrictions can make it harder to make progress on projects. It's like telling people not to give each other pieces of paper or talk outside of official channels. That might make sense at Nasa mission control (for example), but it does not where I work.
Although you do have to make sure you upgrade all your SyncThing clients in lockstep - which is a faff when you've got a mix of OSs to deal with - otherwise you'll get protocol mismatches and nothing works right.
Thanks, that's useful to know. I am using it in a fairly restrictive corporate environment where the GitHub download links are blocked, which makes updating difficult.
Perhaps. Unusable on anything that's not Chrome, dark ad patterns galore, and if you let the link expire, it's game over. As for me, I would use the words "almost bearable."
Your browser is not supported.
Unfortunately this browser does not support the web technology that powers Firefox Send. You’ll need to try another browser. We recommend Firefox!
It would be nice to know what web tech they are using that isn't supported. Whatever it is, Chrome works.
EDIT: It requires support for the AES-GCM key type, with a size of 128.
From user eridius a couple days ago: "It's checking for window.crypto.subtle. Looks like Safari TP supports this. I believe the problem with Safari 10 is that it implemented an older version of the web cryptography standard." https://news.ycombinator.com/item?id=14904307
Safari 10 supports an older version of Web Cryptography. 2 weeks ago on the WebKit Blog there was a post about the current state of Web Cryptography and the differences between the older version (that Safari 10 supports) and the new version (that Safari Technical Preview supports). And it says that the older Web Cryptography only supported AES-CBC but the new one supports AES-GCM.
Otherwise technically illiterate people used to be able to do this with AIM direct connect over 15 years ago. It still blows my mind that AOL had a near monopoly in this space, and lost it by continually making the user experience worse.
I worked at AOL for a year. They have a way of taking a great idea/product and completely destroying it with idiotic features that nobody wants (but some exec thinks would be cool). It's probably catalyzed by their unfathomably stupid "matrix management" structure which optimizes for management bloat, confusion, and mediocrity.
Sorry, I missed your comment. If you've seen Office Space, comically it's kind of like that. I had four different direct bosses, each of whom was supposed to manage a different aspect of what I did. What ended up happening most of the time was I had a "real" boss who would direct myself and a few other developers, and then we'd have random drop-ins from managers who had no clue what was going on, trying to change things to meet some ridiculous quota they had.
Really the whole thing is set up to limit manager accountability (since managers can just point the finger at each other when something goes wrong) and increase worker frustration... it's not uncommon for two managers to tell you to do two different things.
I always wonder how Opera Unite (in the 12.x versions) would have fared had it gained traction. The sender had absolute control over what files were shared and for how long, without needing to rely on a 3rd party to host content or set up a complex service on localhost. Opera killed off the Unite service even before they migrated to WebKit/Blink, but it is something I remember fondly.
That sounds amazing. I really hope we see this in another browser some day.
I wrote the original TCPSocket implementation for Firefox OS. As I was doing it I imagined an architecture like what you described. I know at least some prototype apps which worked in a similar way were developed.
One issue with the experiment is that it has such a narrow use case. Disappearing after one download / 24 hrs makes sending a file to multiple people, or to just one person who drags their feet on the download, really inconvenient. Even offering "1 download OR 24 hrs" would make it far more useful.
They're trying to avoid running the next Mega. It also posts the unencrypted hash of the file you uploaded to their server so that they can do DMCA takedowns.
There are other existing solutions like https://share.riseup.net which do not do things like this too and which do allow multiple downloads.
I assumed zipping to be deterministic and that a standard gzip would lead to a "known" hash, ergo the idea of creating an unknown zip file by adding some bytes.
It would be deterministic only if all the header fields[1] stayed identical. A change in something like the file name or mod time would change the hash of the zipped archive without affecting the hash of the actual file.
I don't think you can expect the file mod time to be identical, as it isn't preserved through many forms of file distribution.
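This is easy to verify with the stdlib: two archives of identical content that differ only in the stored name hash differently, while the member inside keeps its original hash (the date_time tuple is pinned so the name is the only difference):

```python
import hashlib
import io
import zipfile

def zip_bytes(content: bytes, arcname: str) -> bytes:
    # Build an in-memory zip with a fixed timestamp so only the
    # stored name varies between the two samples below.
    buf = io.BytesIO()
    with zipfile.ZipFile(buf, "w") as zf:
        info = zipfile.ZipInfo(arcname, date_time=(2017, 8, 1, 0, 0, 0))
        zf.writestr(info, content)
    return buf.getvalue()

payload = b"the actual file contents"
a = zip_bytes(payload, "song.wav")
b = zip_bytes(payload, "not_song.wav")

# The archive hashes differ...
assert hashlib.sha256(a).digest() != hashlib.sha256(b).digest()
# ...but the member's content (and hence its hash) is unchanged.
assert zipfile.ZipFile(io.BytesIO(a)).read("song.wav") == payload
```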
>It also posts the unencrypted hash of the file you uploaded to their server so that they can do DMCA takedowns
Why would they do that? It's not like anyone can send them DMCA notices: the act of verifying that the upload violates your copyright already removes the file from the service. And I don't see anything in the DMCA that requires them to search for offenders, they just have to act once they become aware?
> And I don't see anything in the DMCA that requires them to search for offenders, they just have to act once they become aware?
The Megaupload case sadly is a good example of what no one wants to repeat, as exactly this was argued there: that, once notified, the host has to ensure the file won't ever be uploaded again, and that all other copies are removed, too.
Hmm... it seems like most of the time when I want to transfer a large file to someone (or to another one of my own devices), I just want to do it immediately and only once, so there's no need to upload it to a third, temporary location.
Unfortunately it seems like most of the time a physical USB flash drive is the most efficient way to accomplish this. Seems absurd to me that in 2017 there's not a common, user-friendly way to just establish a direct connection between two web browsers and directly push files through.
They were never intended to be active senders or p2p workers. I know, we've come a long way in the past 2 decades, but I never expected this to be the job of the browser. I'm also aware of WebRTC, and I'm a bit uncertain why that can't be used to send/receive files.
There are a bunch of WebRTC file transfer web apps but they unfortunately all suffer from the legacy cruft that WebRTC brought along. The only data channels are UDP and things like STUN / TURN / ICE are needed to have any hope of breaking through NAT, often with disappointing results.
Ya, for Dat[1], we had a WebRTC implementation and it was unreliable compared to our other clients. We are hoping other in browser p2p options get developed, e.g. https://github.com/noffle/web-udp.
> Hmm... it seems like most of the time when I want to transfer a large file to someone (or to another one of my own devices), I just want to do it immediately and only once, so there's no need to upload it to a third, temporary location.
When it works, AirDrop between Apple devices seems like it does exactly what you describe. Of course, this assumes that the two devices can see each other via Bluetooth and I'm pretty sure the thing is partly mediated by iCloud.
I'm not sure if there's an equivalent in the Windows world.
> ...establish a direct connection between two web browsers and directly push files through
Is this really a web browser's job? Why does it feel like the default answer to every "Why can't I do X easily?" question is to make the browser do it?
I used to transfer stuff between different Windows machines (on the same network) by simply going Start, Run, \\OtherComputerName and authenticating. Regular users, like my parents, would just go to Network Neighbourhood and find the computer in question. It was pretty easy and all built in to the OS.
Shameless plug: I created an "airdrop" that works on all major platforms. Device directly to device. No servers. No "cloud". Local file transfers only. http://feem.io
If you're in the same physical space / layer 2 network, many things are Simple and Easy (and Efficient, indeed). For sending to people that are 100 ms out there, not so much.
A really needed service, but I doubt it will last for long, because it's far from the core business and because it will potentially cost more than Mozilla is willing to dedicate to it.
I did a quick scan of the article, but is there any difference from WeTransfer? The only things I found are the encryption and that the limit is 1 GB less. But since WeTransfer is a Dutch company, they are not allowed by law to look into the files you send, if I am correct.
I really don't get why people are criticizing this and saying that there are better alternatives. Of course there are. This was not built to be the best way to send files, just the most practical one. Some people don't even know there is life outside of Facebook; they will never know about alternatives for sending a file they could not send using email or a messenger. And this shows that Mozilla is starting to build service layers on Firefox.
It would be nice to be able to self-host this on a small home server for friends and family. That way, even if they shut down their server, you could still share files with your friends.
What is wrong with having an HTTP file host on your local PC? Anyone can browse to an HTTP address, and you can get a subdomain for free. Get a free HTTPS cert from Let's Encrypt and possibly put HTTP user accounts/passwords on it to restrict access.
The problem with all these P2P approaches is that they work sometimes, or barely. Aforementioned file.pizza can't even send me a file between my laptop and desktop, both being behind the same NAT.
EDIT: Well, what do I know, it worked today, but hadn't a few weeks ago when I first heard of it. The transfer speed, however, makes it seem like it's actually pushing the data through the entire internet: it's crawling along at about 500 kB/s.
YES, this is what I was hoping send was, another implementation of webrtc p2p file sharing.
Unfortunately it is not.
My only issue with filepizza is, I wanted to host one internally, but it has a hard dependency on a hardcoded list of torrent trackers. (I have a DENY ALL firewall rule at the border which can't be touched) :(
FilePizza creator here! I hear you—the dependency on the broadcast servers has been giving me issues. I'm planning on going back to a home-grown file transfer method over simple WebRTC, rather than using WebTorrent. Hopefully will get a fix out in the next couple days.
We do NAT traversal but also connect to local peers over multicast DNS. Command line client should have a local or offline option to restrict only to local peers soon.
"Better" is incorrect and presumptive. I'm frequently bound by idiotic networks that prohibit P2P in every possible form they can obstruct, including (through no fault of its own) WebRTC.
I find "burn after reading" downloads for a number of reasons, but generally, they often don't work as intended.
For example, modern email services (Google, MS, etc.) accessing links in emails and download the content and check it for malware. They probably mitigated this but its caveats like this that cause messages to be burned before the intended reading.
True! Also like one commenter said on the original article, if I have to share a file with 3 users, I will have to upload it 3 times and also there's the case of failed downloads.
Why are you jumping from imagining a specific use case (1:N file share) failing (with little evidence that the product doesn't/can never recover from failures seamlessly) to calling the entire product into question?
According to yesterday's discussion they do have problems with handling failed downloads, they might want to fix that.
But if they handled 1:N file sharing, they would risk being attractive to a lot of users they don't want (mostly those distributing files in copyright-infringing ways).
Since downloading the file has side effects, I assume Mozilla would require the user to push a button that initiates a POST request. And I don't think any modern email service would initiate a POST request, because that would be a massive vulnerability in that service.
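A minimal sketch of that idea (token and in-memory storage invented for illustration): GET is side-effect-free and just renders a download button, while only an explicit POST consumes the one-time file, so a link-prefetching scanner can't burn it:

```python
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

# One-time payloads keyed by URL token (in-memory stand-in for real storage).
FILES = {"abc123": b"secret report"}

class OneShotHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Safe for prefetchers: render a download button, touch nothing.
        token = self.path.strip("/")
        body = (b"<form method='POST'><button>Download</button></form>"
                if token in FILES else b"gone")
        self.send_response(200 if token in FILES else 404)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_POST(self):
        # The side-effecting step: serve the file once, then delete it.
        data = FILES.pop(self.path.strip("/"), None)
        body = data if data is not None else b"gone"
        self.send_response(200 if data is not None else 410)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the sketch quiet
        pass
```

A GET prefetch can hit the URL any number of times without consuming the file; only a deliberate form submission does.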
How is Mozilla going to keep this viable? Since they're using S3, it likely costs them roughly $.08/GB moved between users in bandwidth costs plus whatever fraction of a month the file is left there of the $.025 GB/month storage costs.
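Back-of-the-envelope with those quoted S3 rates: a 1 GB file downloaded once and deleted after a day costs roughly eight cents, almost all of it bandwidth:

```python
BANDWIDTH_PER_GB = 0.08        # $/GB transferred out (rate quoted above)
STORAGE_PER_GB_MONTH = 0.025   # $/GB-month at rest (rate quoted above)

def cost_usd(size_gb: float, downloads: int, days_stored: float) -> float:
    transfer = size_gb * downloads * BANDWIDTH_PER_GB
    storage = size_gb * (days_stored / 30) * STORAGE_PER_GB_MONTH
    return transfer + storage

print(round(cost_usd(1, 1, 1), 4))  # -> 0.0808
```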
Personally, I don't think that they'll keep it. It is specifically in the Test Pilot program which is for trying out and seeing the response for things, even if they are not necessarily realistically going to be put into the browser.
I would expect them to move to some other provider if they keep this running, considering AWS is among the most expensive solutions in terms of bandwidth costs.
The file is deleted after one download or 24 hours, which limits the damage. Of course, it's also open source, so you can host your own, which I think quite a number of people will do.
It need not be communicated over the phone. Say you want to move a file from your desktop to your mobile, or transfer quickly from your iOS phone to your friend's Android one. This will help in those cases, but it is limited by the randomly generated URL, which you will have to painstakingly type. I don't see how a sufficiently long randomly generated string of memorable words (/jack/never/securely/farted/sky/fall/what/ever) is less secure than a randomly generated token like this.
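The security comparison is just arithmetic: a uniformly random string of n symbols from an alphabet of k choices carries n * log2(k) bits, so word URLs can easily match or beat a hex token (word-list size and lengths below are illustrative):

```python
import math

def entropy_bits(alphabet_size: int, length: int) -> float:
    # Bits of entropy in a uniformly random string: length * log2(choices).
    return length * math.log2(alphabet_size)

print(entropy_bits(2048, 8))  # 8 words from a 2048-word list -> 88.0 bits
print(entropy_bits(16, 16))   # 16 hex characters             -> 64.0 bits
```

So as long as the words are chosen randomly (not a human-picked phrase), the memorable URL is strictly a UX improvement.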
Why will you have to type the URL? You just send the URL to the person over any communication medium. For your own devices, you even have apps that provide shared clipboard so you just copy on first device and paste on another.
Not OP, but I've occasionally found that it's very hard to send messages between my own machines, because most platforms are based on users, not machines.
I can message my wife from my phone to her phone, but I can't message myself from my phone to my PC. At least, with most messaging programs.
I suspect they have plans to make this into some form of browser feature if it's successful and the experience is good. It is a task people normally use websites for, and in the past that was enough justification for Mozilla to turn something into a browser feature.
My personal preference is browser-agnostic methods[1], giving the sender the choice of whatever method of encryption they wish. I prefer the simplicity of 7-Zip / p7zip, but others may prefer PGP.
I use https://transfer.sh/ for this kind of ephemeral file transfer. They have drag/drop through website, integration with ShareX, and even an alias that you can add to your shell.
DCC Send is an excellent solution (although with quite crappy UX until you get used to it). It does, however, suffer from a couple of limitations that make it unviable in 2017.
1) It can't traverse NAT; you have to forward the ports on your router, which is quite frustrating.
2) It can't use IPv6, which would have eliminated the first problem.
Unless of course you have a dedicated IPv4 for your DCC sender and receiver, but I think that is improbable.
Yesterday I wanted to quickly send over some large files from my Debian server to an ancient Windows one on my home network. The server is running on ZFS, so in theory: install Samba, turn usershares on, zfs set sharesmb=on rpool/x.
I ended up copying over on an SD card after ~1 hour of fighting with SMB version compatibility, SMB vs Linux permissions, and workgroup mismatches due to a localised Windows.
I find HTTP generally much more reliable. Normally I use woof (just run "woof <file>" to start an HTTP server serving that file), but I'd like to find something better, since it doesn't handle multiple connections well.
woof also supports uploads (shows a basic uploading page), which is nice when you want to transfer to a server.
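For the multiple-connections complaint, a minimal woof-like stand-in can be sketched with just the Python standard library: `ThreadingHTTPServer` handles each request in its own thread, so several clients can download at once. This is a sketch under my own naming, and it lacks woof's niceties such as the single-download limit:

```python
import os
import sys
from functools import partial
from http.server import ThreadingHTTPServer, SimpleHTTPRequestHandler

def make_server(path, port=0):
    """Serve the directory containing `path` over HTTP.

    port=0 lets the OS pick a free port; each request is handled in
    its own thread, so concurrent downloads work.
    """
    directory = os.path.dirname(os.path.abspath(path)) or "."
    handler = partial(SimpleHTTPRequestHandler, directory=directory)
    return ThreadingHTTPServer(("0.0.0.0", port), handler)

if __name__ == "__main__" and len(sys.argv) > 1:
    httpd = make_server(sys.argv[1], port=8080)
    name = os.path.basename(sys.argv[1])
    print(f"Serving http://0.0.0.0:8080/{name}")
    httpd.serve_forever()
```

Note it serves the whole containing directory, not just the one file, which is another way it falls short of woof's behaviour.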
Ha, I ssh'd into a box and wanted to download a file ... several Stack Exchange threads later, I ignored ssh/rcp and other suggestions and just used the fish: protocol in Dolphin (on KDE) and drag-dropped the file. I'm sure it took way longer than it should have, but the other methods seemed way more complex; I was expecting an ftp-like copy command to be available.
Either way, it looks like a good promotion trick for Firefox if many people end up using it. Good job whoever came up with it and convinced Mozilla leadership to deploy it.
When it first came out, it "just worked". The simplicity of the folder=key was great. If you have the long key passphrase, you are in.
Then they implemented a whole bunch of features that I thought were clunky, emailing the passphrase, QR codes I think, the ability to require approval to access shares, sub-permissions, etc.
2 years ago or so they released a version that was not backward compatible, so everyone had to upgrade or not, and if only some peers had upgraded, things got out of "sync" heh.
Finally, we got my team all on resilio sync and we all had 100% CPU pegs on certain shared folders.
I gave up on it.
When it was really simple and it synced, it was great.
If this picks up, email services (which own cloud-sharing facilities) might put up a huge warning in red stating the security risk to their users in clicking that link. I wonder whether Mozilla feels having the file scanned by VirusTotal before encrypting would violate user privacy.
So what is this going to cost the user in the future? I can't imagine this will be a free service forever if Mozilla has nothing to gain from eating up tons of bandwidth.
At $10 per TB of network data transfer, with the guarantee of a single inbound and outbound copy of the data file, it's more likely that some jerk will try to DDoS the service to make themselves feel puissant than it is that this will cost a significant amount of money as a TestPilot experiment.
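Plugging illustrative numbers into that pricing makes the point concrete. Every figure below is an assumption for the sake of arithmetic, not Mozilla's actual traffic or rates:

```python
# Rough monthly cost model for a Send-like service.
PRICE_PER_TB = 10.0        # USD per TB transferred (the figure above)
FILE_GB = 1.0              # size of a typical share (assumed)
COPIES = 2                 # one inbound upload + one outbound download
SHARES_PER_MONTH = 100_000 # assumed usage

tb_transferred = SHARES_PER_MONTH * FILE_GB * COPIES / 1000
cost = tb_transferred * PRICE_PER_TB
print(f"{tb_transferred:.0f} TB -> ${cost:,.0f}/month")  # 200 TB -> $2,000/month
```

Even at a hundred thousand 1 GB shares a month, the bandwidth bill is on the order of a few thousand dollars, which is small for a Test Pilot experiment.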
TBH for most people that use WeTransfer I don't think this is a major concern, though it maybe should be.
This smacks of Mozilla misjudging the market again and wasting a lot of time and money on things that just won't work at scale. Annoying because they are by far the richest 'tech charity' and have a lot of really amazing engineers. More Rust and Servo (concepts that very few people can execute on), less random startup ideas, I think.
> This smacks of Mozilla misjudging the market again and wasting a lot of time and money on things that just won't work at scale.
It's an experiment, part of the Test Pilot program. A relatively low-budget MVP to test the waters and see if the concept is worth exploring. If not, the experiment ends and the organization moves on.
The program exists precisely to counter your concerns here while ensuring Mozilla can continue to innovate in new areas.
I assume it is a bit difficult though, since the encryption is performed in the browser. Thus API clients would have to perform the exact same crypto to use this service.
I just tried it and looks like the GFW doesn't block it yet. Upload speed was surprisingly good too, much better than I ever get downloading foreign websites.
Good point. BUT, comparing it to Snapchat implies a tonne of other things that this isn't (potential for millions of users, teenage users targeted, super cool interface that no one can use except teenagers, ...). This is a clickbait title and shouldn't be encouraged.
I don't really see the point. We have had temporary file-hosting services for years. Moreover, I find it extremely annoying that it requires JS and multiple 3rd-party resources in order to work properly (none of the other services that I know of require that).
I think that it would be better if Mozilla focused more on their important projects, such as Firefox, Servo and Rust.
Doing a quick Google search, the other temporary file-hosting services have much smaller limits, and Firefox has a better reputation than these others (for confidentiality, etc.).
Seems superfluous. In Windows 10 you can right click on a file > Share > choose any app or person to send the file. Or in the latest version you can just drag/drop a file on top of a person pinned to your taskbar.
The only items I have in that sharing menu are e-mail (clearly there would be no need for this service if that were good enough), OneNote (which makes no sense), and advertisements for Store apps that I don't want to use. I think it's safe to say that menu does not replace a service like this.
The problem with that is Windows' lack of apps. The UX is better and will be consistent regardless of whether you're sharing from an app, File Explorer, or a file picker.
I thought this was new Firefox UI at first. This seems like a service that would be nice to integrate into OS share flows. Windows, Android, and iOS all have share contracts and that seems more natural than going to a website.
I don't know for certain without digging into the code but they are probably using the WebCryptoAPI and doing everything client-side to encrypt the file.
The URL that is shared contains the key for the file. You'll notice that the URL contains a fragment identifier, i.e. the #foo part of http://example.com/#foo; this isn't transmitted to the server by the browser, and therefore the key isn't exposed beyond whoever the URL is shared with.
Yes, since they could change the JS without notice to do something different, and could conceivably be ordered by a government to do so generally or targeting a specific set of users.
Data after the # in the url should not be sent to the http server by the client. Encryption/decryption is presumably handled in the users browser by JavaScript.
The statement about not having the ability to access the contents of the files is perhaps somewhat misleading as they do control the JavaScript that either creates the key or will be given access to the key when someone retrieves the file (by reading it off the end of the url).
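The key-in-fragment behaviour described above is easy to demonstrate: standard URL parsing keeps the fragment separate from everything the browser puts in the HTTP request. A small sketch with a made-up link (the path and key below are placeholders, not Send's real format):

```python
from urllib.parse import urlsplit

# A hypothetical share link: the path names the file on the server,
# while the decryption key rides in the fragment (after '#').
url = "https://send.example.com/download/abc123/#secret-key"
parts = urlsplit(url)

# The browser builds its HTTP request from scheme/host/path/query only;
# the fragment stays on the client, so the server never sees the key.
print(parts.path)      # /download/abc123/
print(parts.fragment)  # secret-key
```

The caveat from the comments above still applies: the server never sees the key in transit, but the JavaScript it serves runs on the client with full access to it.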
When inevitably someone copy-pastes the URL into Google Search, will Google visiting the URL then cause the file to be deleted before the intended recipient can download it?
Are there other ISP-based systems, say, that sample the head of a file for anti-malware purposes and might do the same?
Mozilla also say that your Sync passwords are secure, but they aren't — they are secured with one's account password, which is processed by JavaScript downloaded from Mozilla. At any time they can target — or be compelled to target — a user with malicious JavaScript which sends his password (and hence access to all his 'secured' data) to Mozilla or any other organisation.