> Javascript only has so much room for the bad to hide in--it's a little quirky, but isn't half as open to WTFs as, say, C++ (any generation thereof).
You know, I know JavaScript only a fraction as well as I know C++, and I can say with a great deal of confidence here... no.
C++ has lots of WTFs in strange little corner cases (mostly around C compatibility), and C++11 has done a good job of cutting them down. In JavaScript you needn't go to the corner cases... the weirdness is right there in the normal cases.
We're talking about a language that still doesn't have a standard way of reading and writing bytes... for talking to hardware. Think about that for a moment.
> Don't be hating just because these new kids don't have to bang their faces into machine code or shitty C macros--it's great to have a way of trying different programming techniques for embedded systems.
Honestly, I'd not have blinked if the idea was to use Java (even with its lack of unsigned arithmetic), C#, Objective-C, Lua, or even say Go to do the job. I mean, you need to use some kind of a language and you want it to be something that a lot of programmers will find accessible.
But picking JavaScript for this job is kind of like picking Forth to build the new, more accessible database query language...
"We're talking about a language that still doesn't have a standard way of reading and writing bytes... for talking to hardware. Think about that for a moment."
Wrong. Javascript has typed arrays. When's the last time you actually looked at Javascript? 1998?
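For the record, here's a minimal sketch of byte-level reads and writes with typed arrays (ArrayBuffer, Uint8Array, DataView), which have been standard ECMAScript since 2015 and shipped in browsers well before that:

```javascript
// Byte-level I/O with typed arrays: one ArrayBuffer, two views into it.
const buf = new ArrayBuffer(4);        // 4 raw bytes of backing storage
const bytes = new Uint8Array(buf);     // byte-by-byte view
const view = new DataView(buf);        // multi-byte, endianness-aware view

view.setUint16(0, 0xCAFE, false);      // write a big-endian 16-bit word
bytes[2] = 0x01;                       // poke individual bytes directly
bytes[3] = 0x02;

const word = view.getUint16(0, false); // reads back 0xCAFE
```

Both views alias the same buffer, so mixing byte pokes with endianness-aware multi-byte reads just works.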
"Honestly, I'd not have blinked if the idea was to use Java (even with its lack of unsigned arithmetic), C#, Objective-C, Lua, or even say Go to do the job."
They are using lua. They just have a front end to lua that uses javascript syntax, it would seem. Because languages without curly braces are scary, and the goal is to make this stuff accessible.
To the expert, making your niche accessible to plebes is scary. I understand.
As recently as a few months ago I had to base64 encode a binary protocol because the JavaScript guys couldn't handle decoding the raw bytes.
> They are using lua.
Yes, but it seems like lua is merely used as a target rather than a programming language. What language people actually code in is highly relevant if your goal is to make a domain more accessible.
> They just have a front end to lua that uses javascript syntax, it would seem. Because languages without curly braces are scary, and the goal is to make this stuff accessible.
There are a lot of languages with curly braces that aren't half as scary as JavaScript.
> To the expert, making your niche accessible to plebes is scary. I understand.
I appreciate your "understanding". Next time you are trying to make the big "I can read your mind" power play, you might want to read the context a bit more carefully.
I don't generally do programming that talks directly to hardware, so even by your rationale I have no reason to be threatened, and I'm all for making this niche more accessible. I've seen excellent jobs of making hardware accessible to novice programmers using Java, Smalltalk, and even things like SCRATCH and Alice. I think that stuff is great. As you put it, "making a niche accessible to the plebes" is a great endeavour and what I have spent most of my career striving for. I've learned a thing or two about that path along the way. This isn't the first time a choice like this has been made. I thought I'd share that this particular choice, if anything, undermines the goal.
"I thought I'd share that this particular choice, if anything, undermines the goal."
But that isn't what you shared. All you had to share were vague old-man gripes about a language, with the only concrete example being something that is only half true.
For christ's sake, you suggested Java and Objective-C for the task. How do you expect to have any credibility after that? For a novice language, why are you expecting a typical task would be decoding a binary protocol?
To claim that you are just sharing your wisdom here is intellectually dishonest.
> For christ's sake, you suggested Java and Objective-C for the task. How do you expect to have any credibility after that?
Because while they might not be considered "hip" languages, they have large developer communities and are comparatively straightforward to pick up for any web developer that isn't familiar with them, yet they still have good support for the task at hand: talking to hardware.
> For a novice language, why are you expecting a typical task would be decoding a binary protocol?
Because they're talking to hardware, and because working with binary is actually much simpler (if your language doesn't have some bizarre aversion to it).
If you've ever taught novices to program, it's actually easier to start with binary, as you avoid all the complexity of text encodings and parsing. You want an integer? Read an integer. No length-prefixed fields. No reserved characters. No escape sequences. No terminating characters. No character set encodings. No case sensitivity or symbolic equivalents to worry about. The worst you might have to deal with is endianness, and usually you can dodge that issue by starting with native endianness. C makes it harder with its "not sure what width that is" fixed-point integers, but that's part of why you don't pick C for that job either.
Working with binary only really becomes a pain once you have to start talking to humans.
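To make that concrete, here's a minimal sketch of decoding a fixed-layout binary message with a DataView. The layout and field names are invented for illustration: a big-endian uint8 type, uint16 length, and int32 value.

```javascript
// Decode a made-up fixed-layout message, big-endian:
//   uint8 type | uint16 length | int32 value
function decodeMessage(bytes) {
  // Wrap the Uint8Array's underlying buffer for multi-byte reads.
  const view = new DataView(bytes.buffer, bytes.byteOffset, bytes.byteLength);
  return {
    type: view.getUint8(0),
    length: view.getUint16(1, false),  // false = big-endian
    value: view.getInt32(3, false),
  };
}

const msg = decodeMessage(new Uint8Array([0x07, 0x00, 0x04, 0x00, 0x00, 0x01, 0x2C]));
// → { type: 7, length: 4, value: 300 }
```

No escaping, no encodings, no delimiters: each field is just a read at a fixed offset.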
> comparatively straightforward to pick up for any web developer that isn't familiar with them
compared to javascript.
really?
you think java is "easy to pick up" ?
I'm going to have to just say I disagree with you there. I don't like my chances of convincing you how silly that sounds. You seem to be operating on some very peculiar assumptions.
> compared to javascript. really? you think java is "easy to pick up" ?
Yes, compared to JavaScript, it absolutely is.
Really, the only PITA with learning Java is all the framework-itis (which is starting to become an issue with JavaScript as well, but that's another story...). For things like embedded systems, Java doesn't have any of that stuff and it magically returns to the comparatively simple language for programming network aware hardware it was originally intended to be.
> You seem to be operating on some very peculiar assumptions.
My assumptions are from teaching, and from watching others teach, both children and adults how to program (with adults it has usually been people who are already professional programmers). I'd absolutely agree that Java isn't the best teaching language, but particularly if you've got someone who already knows how to program at least at some basic level (which is kind of what I think of when I hear "web developer"), it tends to be pretty easy to pick up.
JavaScript on the other hand, is quite the opposite.
I've worked with "copy-and-paste" developers who couldn't really write their own program from scratch to save their lives. When you asked them to explain what the code they'd pasted in was doing, JavaScript was invariably the language where they had the hardest time deciphering what was going on, and more often than not, you ended up sympathizing with their difficulty. (C++ and Perl can also be quite difficult to decipher, but they tend to have the advantage that anything much more complex than a one-liner generally won't work at all if blindly copied and pasted, so copy-and-paste coders tend to be a rarer breed there.)
> Anybody can open up a browser and immediately start playing around with Javascript.
Tell me, if you were looking to teach people to write Java code for web applications (I know, perish the thought!), would you start them off writing applets?
> You simply can't say the same for Java, C, C++, or assembly--frameworks and the other nonsense notwithstanding.
Your statement about the ubiquity of JavaScript runtimes is really JavaScript's strongest feature IMHO, but I don't think it is terribly compelling in terms of developer accessibility. You could say something pretty similar in support of DOS shell, PowerShell, AppleScript, VBScript, VSMacros, or XSL, but no one would consider that a defensible argument for them. Sure you want accessibility, but I think it's okay to suggest that a one-click install not be outside the grasp or patience of a developer on their way to writing for distributed device systems... ;-) The difference between "open your browser and start learning this language" and "open your browser to download and install this app so you can start learning this language" shouldn't separate anyone who was going to make it in the first place.
Well, he can make his point any day now. Still waiting. For now all I have is that he doesn't like javascript because it can't do binary (although it can, so the conflict should be resolved, right? Did he have another point?).
He doesn't care, probably. You're just some kid who can't/won't program in anything besides javascript (of all languages). Nobody owes you an explanation of why that's limiting. I'm surprised he indulged you for as long as he did.
For someone who has no clue what they are talking about, you are remarkably dismissive.
I get the point that he doesn't like javascript. Why should anyone care about his personal preferences?
I code in lots of languages besides javascript. I am not insisting on anything. If it were my choice I would have gone with lua, or some dialect of logo or lisp, or even haskell. It is him insisting that javascript should not be used, with the only justification being something that is not true. Having thoroughly debunked the one piece of evidence he had, what's left? Why should anyone take his opinion seriously?
Resorting to calling me "some kid" is pretty classic though. Did you run out of real things to say, and decided to resort to just dismissing me with ad hominem?
> For someone who has no clue what they are talking about, you are remarkably dismissive.
You keep using that word. I do not think it means what you think it means.
> I get the point that he doesn't like javascript.
Actually, that wasn't one of my points.
> Why should anyone care about his personal preferences?
I have no idea.
> it is him insisting that javascript should not be used
Read again. I did not insist that JavaScript should not be used. I said I did not feel it was a good choice, and I pointed to some obvious signs of why it might not be. My opinion being what it is, that is no basis for insisting upon anything, but even if my opinion were law, the fact that something might not be a good choice is no reason to insist it not be done.
> Having thoroughly debunked that one piece of evidence he had, what's left?
You keep using that word. I do not think it means what you think it means.
> Resorting to calling me "some kid" is pretty classic though. Did you run out of real things to say, and decided to resort to just dismissing me with ad hominem?
Hey, one more dismissal! I think we have a record.
Seriously? You're going there? Did you forget that you bolstered your argument with "old man" just yesterday? One might be concerned about projecting so much...
sorry about "old man".
What I wrote was "old man gripes", which was aimed at the "I'm afraid, uncertain, and I doubt this technology" attitude, not at you as a person.
Poorly chosen words. Mea culpa.
"Some kid" clearly was directed at me as a person. So... yeah. I'm going there.
No worries. I didn't take it personally, but it was kind of a "what kind of argument is that?" moment.
I generally don't think of FUD as an "old man" argument (more like a "man keeping you down" kind of thing ;-), and not really very applicable to a venerable and pervasive language like JavaScript, though I guess Node is newish (and ironically the JavaScript environment I'm most familiar with).
> "Some kid" clearly was directed at me as a person.
I think if you look carefully at the thread, you'll see that the "old man gripes" comment unfortunately presented you as a petulant child and provoked an anti-ageism response from jbooth, which ultimately ended in the "some kid" comment.
I don't think jbooth was trying to make a technical argument at that point; it was indeed personal, and an empty ad hominem, but you unintentionally opened the door for that. Something to keep in mind when processing his comments.
It was a personal argument, but it was aimed at a lack of perspective and at not knowing what he doesn't know (a chronic failing of youth, and I'm young myself), rather than an attempt to insult him for its own sake. As you're hinting at, rather than being upset at me calling him 'some kid', maybe he should wonder why his age is so easy to peg.
He didn't seem to grasp why working with streams of bytes is important in this context, and why extremely spotty/inconsistent support for it is not sufficient. If he had a broader base of experience, or if he was inclined to listen to those who have such experience, maybe he'd get it.
> They are using lua. They just have a front end to lua that uses javascript syntax, it would seem.
Regarding the limits of how far this Lua / javascript combo can or should take us I refer you to the words of Roberto Ierusalimschy, creator of Lua, himself (I agree with him):
> I must confess that I would be very reluctant to board a plane with flight control implemented in Lua or any other dynamic language.
I think if the goal is to make little gadgets that put large block letters over jpegs of cats and post those jpegs to social media sites, Tessel's software side is on the right track. It might not be so great for "internet of things" gadgets that interface with the real world, in real time, in novel ways.
(in fairness their hardware looks interesting, though currently overpriced)
I think the market this is aiming at is more Arduino scale. Would you board an airplane powered by an Arduino? Or an Apple II, for that matter? That was another product aimed at making something more accessible to "software" type people.
Except I'm not sure how well that would work since it would then be compiled down to Lua...
And of course, C++ isn't exactly a great choice for making hardware programming more accessible. I hear tell that's how a lot of it is done already. ;-)