"But along with the cash, Watsi has also raised eyebrows. To some critics, there’s something distinctly neocolonialist and off-putting about the spectacle of well-off do-gooders in the U.S. choosing which brown people live and die in the developing world based on who has a cuter picture on Watsi. Others wonder whether focusing donations on individuals, no matter how worthy, diverts funding and attention from efforts aimed at tackling the more systemic causes of inadequate healthcare in impoverished parts of the world. Watsi must also bear the misfortune of coming of age during a simmering backlash against Silicon Valley. We’ve gotten tired of hearing the name brand Silicon Valley bigwigs who have invested hundreds of thousands of dollars in Watsi talk about the merits of “disrupting” existing industries, when all that really seems to end up happening is that a few people get rich while the competitive screws tighten on the many."
You have an urge to save a life? Make a direct donation yourself or to one of the hundreds of charities that exist. Some charities are certainly better than others, but the options are out there. Watsi is a pathetic excuse for a "company" where this sort of pick-and-choose comes across as a form of good when it's really quite creepy and saddening.
Startups don't need to exist in order to do good, and you don't need to hide behind the veil of backing a company in order to seem like a good person. Give someone the money yourself or go visit those countries if you really give a shit, because there's more to their problems than individuals who happen to need medical care. Most of them don't have any at all.
Watsi is simply another avenue to accomplish goals; that you personally find them "quite creepy and saddening" reflects more upon you than upon Watsi. It's remarkable that you find Watsi "pathetic" simply because you disagree with their means.
I am a homosexual. I donate to the Salvation Army on occasion. The Salvation Army is known for condemning homosexuals along with a litany of other things I truly disagree with. However, having been to Harbor Light/Acres of Hope in Detroit and seeing that they are willing to provide shelter and food to those in need allows me to support them, in some way. The concept of "this isn't ideal - I want NOTHING to do with it" is simplistic and doesn't reflect real-world situations.
I have found that torch-bearing zealotry with regards to ideals only works in the abstract world.
If you're going to be this nasty, you'd better be right, and in this case you're not. If I went in person to Nepal to find people who need help, I still wouldn't do as good a job as Watsi.
The one useful thing about your comment is that it gives us a way to measure the background radiation of mean-spiritedness online. Watsi is about as good as anything ever gets. The founders are as selfless as anyone I've known. And they are not merely unrealistic do-gooders. They know well what things are like in the field. If people wholly focused on doing good and very effective at it still manage to get attacked online, that's therefore the baseline for being attacked online.
From what I've seen, no matter how much of one's life is spent trying to help, people who stick out will have detractors. And there will always be some lazy bum who thinks they know how to do it better... but who isn't actually doing anything at all.
Personally, I think the best response to such is show me. If you can do it better, JUST DO IT instead of telling me how much better your ideas are. Because mostly those people just want to be heard when they have half-baked opinions and no experience whatsoever.
It still doesn't mean that their implementation is right, because it's not.
Here, let me elaborate: Stop trying to tie yourself to the individual and look at the greater problem. It's a west coast elitist mentality that gives you the "hey, look at this person I'm helping out!" Ever see those commercials they've been doing for decades now where you make a generic donation, and the charity sends you photos of people you're helping? That's a much better approach.
You might be genuinely interested in helping others, but other people might not be. Not everyone is equally determined to overcome whatever obstacles stand in the way of helping someone in need. So making it easier for people to help would let more people receive help.
Do you agree this would be helpful to more people?
What struck me as interesting in your comment is how passionate you are about it. It's the kind of comment someone honest and genuinely caring makes after they've been burned. If someone treated you unfairly, it can become an impetus for you to do great things.
I also wish HN readers tried to be more understanding.
It sounds like you're living on a different planet. Watsi allows people who are dying to raise money for treatment that will save their lives, which they otherwise couldn't afford. It's a direct intervention - high impact giving. Few charities have such direct impact.
Many religions/philosophies claim lack of compassion comes from lack of knowing. In that regard, if putting pictures and stories and whatnot on this site gets people more knowledge of poor people's plight, then it can invoke compassion that wouldn't have occurred from just looking at a Unicef box. From a clean engineer point of view it seems stupid to have another company when we already have charities, but another company simply means more people and more PR and more stories getting out there to potential donors, so I don't think it is particularly harmful.
>You have an urge to save a life? Make a direct donation yourself or to one of the hundreds of charities that exist. Some charities are certainly better than others, but the options are out there. Watsi is a pathetic excuse for a "company" where this sort of pick-and-choose comes across as a form of good when it's really quite creepy and saddening.
The "pick-and-choose" mechanism Watsi uses is a (pretty brilliant) psychological hack to grab attention and elicit empathic concern [1]. I suspect a lot of Watsi's donors would not have donated to a similar cause otherwise; if so, Watsi isn't competing for their money with traditional charities.
I would love to see the statistics on how many of the people donating to Watsi are first-time donors to any charity.
[1] Phrased this way it might sound like a bad, manipulative thing to do but I don't think it necessarily is one -- at least in this case.
I especially don't think it's a bad thing because it's a hack that provides significant value to the donor. My wife and I made a small donation (we are fairly poor by US standards) a while back, and I teared up when they emailed me to tell me the girl was fully funded a few hours later. I probably would have gone back and donated to another case right then if they'd had any left (it was right after the Boston Marathon bombings, so I think they were seeing a spike in donors).
This whole thing is pointless because it doesn't do anything useful. There is literally no difference between using the website and simply asking your friends if they know anyone you might hit it off with, and in Facebook's case, there are plenty of superfluous "friends" people have who may actually be terrible people. They allege "social pressure" in these cases, but on the whole it's a stupid term that generally doesn't mean anything.
OkCupid is still the pinnacle of online dating and probably will be for a long time, because the whole point of dating is that you actually go out with someone and see if you hit it off. Just because someone is a friend of a friend doesn't mean they're any better or different than Random Person A, and if anything, you're greatly limiting your pool of options. And lastly, if you're having trouble meeting someone and don't live in the sticks, the problem is likely you.
I tried a few sites in my day and I'd have to agree that OKC is the best I've tried, even if I didn't have luck with it. It was the most pleasing to use and it seemed like the people were of quality.
I'd like to see a site where an algorithm is used to match interests only; if you match there, then photos can be viewed... or maybe not, but at least rely heavily on the interest metric. Maybe a site for intellectual-leaning people (or however one would want to word that) would also do well, or one that relies on Myers-Briggs (though I know that's not for everyone).
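To make that concrete, here is a small sketch of what interest-only matching could look like. It is purely illustrative -- made-up profiles, a made-up 0.3 cutoff, and a simple Jaccard overlap score -- not how any existing site actually works:

```typescript
// Purely illustrative sketch: score candidates by overlap of declared interests
// (Jaccard similarity) and only reveal profiles above a threshold.
// All names, data, and the 0.3 cutoff are made up for the example.
interface Profile { name: string; interests: Set<string> }

function jaccard(a: Set<string>, b: Set<string>): number {
  const shared = [...a].filter(x => b.has(x)).length;
  const union = new Set([...a, ...b]).size;
  return union === 0 ? 0 : shared / union;
}

function matches(me: Profile, candidates: Profile[], threshold = 0.3): Profile[] {
  return candidates
    .map(c => ({ c, score: jaccard(me.interests, c.interests) }))
    .filter(({ score }) => score >= threshold)
    .sort((x, y) => y.score - x.score)
    .map(({ c }) => c); // photos would only be shown for these matches
}

const me: Profile = { name: "me", interests: new Set(["hiking", "sci-fi", "chess"]) };
const pool: Profile[] = [
  { name: "a", interests: new Set(["chess", "sci-fi", "cooking"]) },
  { name: "b", interests: new Set(["football"]) },
];
console.log(matches(me, pool).map(p => p.name)); // ["a"]
```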
I just quickly installed this in a VM and it's literally just a poorly done OS X skin lying on top of Ubuntu, all the way down to dock icons bouncing when clicked. The installer is the standard Ubuntu-derivative fare without much modification whatsoever.
It's not really much of an upgrade from just running Xubuntu or one of the more minimalist window managers out there.
A bad idea run by someone who has no clue about what he's doing combined with another bad idea supported by people who have no clue about what they're doing? Sign me up.
Both nutrition and pecuniary economics are far more complex than Soylent and Bitcoin allow for -- additionally, the history of both of these fields is seeded with innumerable failures to couch each of their fundamentals in terms of simple concepts (for nutrition: protein/fats/carbs, vitamins, micronutrients, and so on -- for economics, Smith, Marx, Keynes, Laffer, et cetera). The idea that either, as a phenomenological consequence of human interaction, has a simple solution is technically possible but foolhardy.
I don't deny that there may be a food that is easy to make but perfect for humans; or that there may be such a currency as well. But for any rational actor to believe it would require not only extraordinary evidence, but specific refutations of the previous failed attempts (rather, the theory would provide those refutations). Such a failure to both acknowledge and rebut historical failures in the same vein is strong evidence of crankism/crankitude/crankosity/I Can't Believe They're Not A Crank.
Somewhat ironically, the foodstuff I might be most inclined to believe would satisfy this requirement is exactly what "soylent" should be -- complete raw human. Om nom nom.
If bitcoin circulation is, in practice, much lower than it's generally believed to be -
“We isolated all the large (≥ 50,000 Bitcoins) transactions which were ever recorded in the system, and analyzed how these amounts were accumulated and then spent. We discovered that almost all these large transactions were the descendants of a single large transaction involving 90,000 Bitcoins which took place on November 8th 2010, and that the subgraph of these transactions contains many strange looking chains and fork-merge structures, in which a large balance is either transferred within a few hours through hundreds of temporary intermediate accounts, or split into many small amounts which are sent to different accounts only in order to be recombined shortly afterwards into essentially the same amount in a new account.”
- Dorit Ron and Adi Shamir, Quantitative Analysis of the Full Bitcoin Transaction Graph.
Then the perceived value and the actual value would be off significantly.
So, while I might not phrase it in quite as strong terms as the person you're responding to does, I do find reason to be somewhat cautious about the whole enterprise.
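For a sense of what the quoted analysis amounts to mechanically, here is a toy sketch of collecting everything downstream of one large transaction in a transaction graph. It is my own illustration, not code from the Ron and Shamir paper:

```typescript
// Toy illustration (not Ron & Shamir's actual code): given a map from a
// transaction id to the ids of transactions it funds, collect everything
// reachable downstream of one large source transaction with a BFS.
function descendants(spendsTo: Map<string, string[]>, rootId: string): Set<string> {
  const seen = new Set<string>();
  const queue: string[] = [...(spendsTo.get(rootId) ?? [])];
  while (queue.length > 0) {
    const id = queue.shift()!;
    if (seen.has(id)) continue;
    seen.add(id);
    queue.push(...(spendsTo.get(id) ?? []));
  }
  return seen;
}

// Long "chains" show up as nodes with a single outgoing edge repeated hundreds
// of times; "fork-merge" structures show up as a node whose outputs all
// reconverge on a later node holding roughly the same total value.
```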
Don't recall who said it, but powerful disruptive innovations are overhyped in the short term but under-hyped in the long term.
That's how I feel about Bitcoin. Circulation might not be high, and the signs suggest it isn't, but it is disruptive. You also have to consider that perceived value is always reckoned into the 'actual' price. It depends on which market theories you subscribe to, but it is generally accepted that current prices reflect the market's ability to guess (and judge) future value. So perceived value is often difficult to 'divorce' from actual value, especially in the case of something like Bitcoin, where it is seen as both an asset and a currency.
Another thing is the continuing trend of providing off-chain transactions (through companies such as Coinbase and Inputs.io).
The study wasn't done in 2010. That's just when that transaction shell game showed up from the perspective of the study. Do you have an up-to-date statistical analysis of the bitcoin transaction record to share?
2. Bitcoin can be "attacked" (for some vague notion of what it means to attack a system that has no security definition) in polynomial time.
3. The economics of Bitcoin are extremely suspect and based on a poorly developed economic model, supported by neither the Austrian school nor by modern monetary theory. This is probably the underlying cause of the lack of a security definition, as the security definition of a digital cash system will almost certainly be driven by the system's economic model.
These issues have been covered ad nauseam by many people, and have been largely dismissed by the Bitcoin community. All the while we have seen increasing amounts of energy sunk into Bitcoin mining, we have seen a major block chain fork that left transactions in question, and we have seen wild fluctuations in the value of Bitcoin currency.
It's hard to downvote a thoughtful comment from a smart person, but you're just plain, flat out wrong.
Bitcoin is a heuristic for Byzantine consensus that requires 50% + epsilon of mining hash power to subvert.
This is not explicitly stated in the original paper, but the original paper comes very close.
I don't know what else you would want.
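For reference, the relevant calculation in the original paper is the double-spend race in its "Calculations" section; here is a small sketch of that catch-up probability, reimplemented here from the whitepaper's math, which is where the 50% + epsilon threshold falls out:

```typescript
// Sketch of the attacker catch-up probability from the original Bitcoin paper
// (reimplemented here): q is the attacker's share of hash power, z is how many
// blocks behind the attacker starts (i.e. confirmations the honest chain has).
function attackerSuccessProbability(q: number, z: number): number {
  const p = 1.0 - q;
  if (q >= p) return 1.0;             // at 50% or more the attacker wins eventually
  const lambda = z * (q / p);         // expected attacker progress over the same interval
  let sum = 1.0;
  let poisson = Math.exp(-lambda);
  for (let k = 0; k <= z; k++) {
    if (k > 0) poisson *= lambda / k; // Poisson(k; lambda), computed incrementally
    sum -= poisson * (1.0 - Math.pow(q / p, z - k));
  }
  return sum;
}

console.log(attackerSuccessProbability(0.10, 6)); // ~0.0002: a 10% attacker almost never catches up
console.log(attackerSuccessProbability(0.30, 6)); // ~0.18: climbs quickly toward 1 as q approaches 0.5
```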
Regarding economics: All you need for a functioning currency is a supply of speculators who will bet on its future value and thus prop it up. That is only sustainable if the currency is the best at what it does for some meaningful niche (e.g. untrusted digital transactions). This is not part of any prior school of economics, but it's close to common sense, and is strongly supported by the success of bitcoin thus far.
You are already giving an unclear definition when you use terms like "mining hash power." What rigorous definition can you give for that term? What sort of model of computation are we working with (or is the definition based on information theory?)?
For comparison, consider the definition of security used in this recently published paper:
The definition is long, but very clear. Probabilistic polynomial time Turing machines are used as a model of computation; the adversary is given more power by being allowed to be non-uniform (i.e. the adversary can be a different machine for different security parameters). The security properties are clearly defined in terms of this model of computation, and a construction is given and proved to meet the properties (under certain hardness assumptions).
Note that there is no possible way that Bitcoin could meet the security definition given in that paper, because that definition requires the existence of a bank that issues the cash. This is true of previous work on digital cash as well, including the work that preceded Bitcoin. That is why it is necessary to develop a security definition that makes sense for systems like Bitcoin -- digital cash systems in which there is no bank. That is the complaint I have: no satisfactory definition has been given.
"This is not part of any proir school of economics, but it's close to common sense, and is strongly supported by the success of bitcoin thus far."
Economics often defies common sense, so I would be wary of using common sense as the basis for a currency.
2. Countries can be attacked too. I don't know what your point is here. If bitcoin survives its infancy, it could become more resilient than any other currency.
3. Could you explain it in practical terms? Because as long as it works, why should I care about what theorists say? Bitcoin's creator said himself that it was an experiment, and rightly so, we have never seen anything like Bitcoin taking off before. Seems like he nailed it on his design.
Yeah, I have seen a lot of people raging. How is that evidence of anything, other than fear of the unknown?
Typically cryptographers will define a security goal in rigorous terms, then prove that their system meets that goal (at least for "cryptomania" type applications like digital cash). There are good reasons for doing this:
1. It makes proofs of security possible, which make us a lot more confident about cryptosystems.
2. It allows us to be clear about what it means to "break" a system. If we are not clear about this, someone could claim that their system cannot be attacked by simply defining security to be the exact behavior of whatever they created. This is analogous to having a falsifiable hypothesis in a scientific experiment.
The original Bitcoin paper did not have such a definition. I am aware of one attempt at making such a definition, but it resulted in a very weak notion of security that placed unrealistic restrictions on what an attacker could do (basically, the authors were trying to find some definition that Bitcoin could satisfy; see point 2 above). In general, Bitcoin's security is highly suspect, since by design the honest parties must scale their work in proportion with the work done by the attacker.
"Countries can be attacked too. I don't know what your point is here. If bitcoin survives its infancy, it could become more resilient than any other currency."
If you are admitting the possibility that an army might attack a country, then you are allowing Bitcoin to be fractured by an army destroying the outgoing Internet connections of a country. In the past, countries have been cut off from the Internet by accident (e.g. anchors being dropped on undersea cables). You could keep an attacking army outside of your territory and still wind up unable to communicate with the rest of the Bitcoin network. This is not a highly convincing argument.
Of course, this is all irrelevant, because a polynomial time attack is not the same as an act of war. There are a lot of organizations with the resources needed to perform the "51% attack" on Bitcoin and no compelling reason to think that a faster attack is not possible. You could attack Bitcoin by performing a lot of computation locally, without ever needing to step foot out the door. Bitcoin also does little to prevent attacks based on sending malicious messages into the network, despite the fact that cryptographers began developing techniques for dealing with that decades ago and despite the fact that almost all that work is freely available.
"3. Could you explain it in practical terms?"
Sure. Let's start with a thought experiment: I have something very rare, which has no practical uses but which is easy to give to others. Will you give me your car for a big pile of it?
Unless you are crazy you would not give up your car. The reason is that you are receiving something that is useless in exchange, and that you would have to go find some other person willing to take a pile of useless (but rare) items. Would your landlord accept some of these rare items as a rent payment? Would the bank accept some as loan repayment? Would the government accept it as a tax payment (let's just pretend that you are a law-abiding citizen who pays their taxes)?
How is Bitcoin any different? A lot of hype was generated about it, but at the end of the day you will not be able to pay your taxes with it, banks are unlikely to accept it for loan payments, and the majority of businesses that claim to accept Bitcoin payments actually accept payments in fiat currency via a Bitcoin exchange. Bitcoin currency has no practical uses (it is basically an energy sink) so the Austrian school does not support it, and it is not legal tender nor is it accepted for tax purposes by the government so modern monetary theory does not support it either.
"why should I care about what theorists say?"
For the same reason you should care about what cryptography theorists, physicists, and doctors have to say.
"we have never seen anything like Bitcoin taking off before."
That is because Bitcoin is the first of its kind. It is the first attempt to create a currency without a legal system. Even gold had value as currency because of a legal system.
"Yeah, I have seen a lot of people raging. How is that evidence of anything, other than fear of the unknown?"
You are calling the informed opinions of dozens of experts in cryptography and economics "raging" because they are saying that the system you love and support is based on dubious technical and economic ideas. It sounds more like you started out believing that Bitcoin is the future and are not willing to accept any argument that concludes anything else.
> There are a lot of organizations with the resources needed to perform the "51% attack" on Bitcoin and no compelling reason to think that a faster attack is not possible.
I'm aware of that. There are also lots of organizations with the power to kill you, yet you won't lose sleep over it, because they have nothing to gain from that, so you are pretty sure that it won't happen. It could be argued that the central banks have a lot to lose to Bitcoin, so they should attack it. But after giving it a bit of thought, I remembered that most people just try to pass the current problems to the next guy (think of presidents, bankers, etc), so why would they bother? Right now, Bitcoin is not big enough to be a threat to anyone. It will keep growing as a threat, but everyone will pass the problem to the next guy, until Bitcoin becomes too big to be stopped. If there is a future that makes sense, this is it. Bitcoin takes over because lazy politicians don't do what they have to (in this crazy world, where their function seems to be to ruin everything), and ironically, that will be the best for everyone.
> For the same reason you should care about what cryptography theorists, physicists, and doctors have to say.
I was thinking about economists mostly (who in many cases will have vested interests), but still, if it works it works. Many theorists have spoken against the phone, the internet, email, aeroplanes, etc., and see what happened. I'm no cryptographer, but Bitcoin doesn't seem like something that would require one to begin with, since all its crypto was done at the user level; it's pretty simple. Satoshi didn't try to create his own hashing algorithm or anything like that.
> You are calling the informed opinions of dozens of experts in cryptography and economics "raging" because they are saying that the system you love and support is based on dubious technical and economic ideas. It sounds more like you started out believing that Bitcoin is the future and are not willing to accept any argument that concludes anything else.
There are quite informed people on the other side too. So what do we do about them? Ignore them? I said raging, because Bitcoin has this crazy effect on a lot of people. They will hate Bitcoin for no reason, spread outright lies, and try to convince everyone that it is a scam. Why? We still don't know what causes it, so we just call it fear of the unknown.
"Satoshi didn't try to create his own hashing algorithm or anything like that."
No, he tried to create his own digital cash system, and digital cash is a cryptography problem that has been extensively studied by cryptographers (and had been studied for decades prior to Bitcoin). Bitcoin is also a system that involves multiparty computation, and secure multiparty computation has also been studied extensively by cryptographers, also going back decades. It is a mistake to think that the only relevant cryptography in Bitcoin is digital signatures and hash functions.
This is really the crux of the issue here. Bitcoin is not a hash function. It is not a digital signature system. The security of hash functions and digital signatures is not in question here; Bitcoin could be vulnerable to attack even if it is built using secure hash functions and secure signature systems. The point of having a security definition is to be clear about these things. We need to be clear about what the meaning of "security" is in the case of Bitcoin if we want to make any statements about whether or not Bitcoin actually achieves that security goal. It is not hard to see that the definition of security for a hash function or a digital signature system is not what we want for Bitcoin; what is not so clear is what we actually do want.
> Sure. Let's start with a thought experiment: I have something very rare, which has no practical uses but which is easy to give to others. Will you give me your car for a big pile of it?
How does the success/value of gold not completely destroy this line of reasoning?
I'm not saying it is destined for success or failure, just that this is not a coherent argument against Bitcoin.
This line of reasoning is inane, and comes up every time gold or Bitcoin is mentioned.
Gold and Bitcoin are merely stores of value. Just like your paper bills with dead presidents on them. Or stocks or bonds. Which has next to nothing to do with how you pay for something. You can easily barter for an item, or pay with credit cards. Some places still do not accept American Express. Some places don't accept any credit cards. That doesn't take away from the fact that they are convenient. Likewise, I can see that Bitcoin could eventually become very convenient for micropayments, since credit cards charge merchants a fee. AFAIK, Bitcoin transfers are cheap or entirely free, thus making micropayments possible.
Currencies are not just stores of value. Real estate is a store of value also, but nobody uses land or buildings as currency.
Once upon a time, gold was used as currency. Merchants would have scales, weights, and other equipment needed to deal in gold. That is not how gold is used in today's world; there are only a few highly niche markets where gold is used as currency, and everywhere else you have either fiat currency or currency that is backed by gold (with fiat currency being vastly more popular these days).
It is also wrong to separate printed money from the rest of the money supply when you are talking about fiat currency. Paper money is just a representation of the currency; the value of a dollar bill is equal to the value of four quarters and equal to the value of a bank account with one dollar in it. Fiat currency is an abstraction created by laws, which is implemented in various ways (paper money, coins, electronic transactions, etc.).
The Yen and the Pound are currencies in certain markets. Lumps of gold are used as currency only in highly niche markets. I did say that gold is a currency in such markets, but it is not generally used as currency in most of the world.
4. Last time I checked, gold was still very valuable. And it has been used for what... Thousands of years? The economies of the world did just fine without forced inflation and consumerism.
5. Printing paper money wastes way more resources. And how is using energy to create something of value a bad idea anyway? Bitcoin doesn't even require that you use a polluting type of energy; for all I know you could be hashing with solar energy.
Most transactions now do not involve cash, and credit card transactions are in fact far more energy-efficient than bitcoin transactions. And it is not just using energy to "create something of value"--it is an artificial barrier built on wasted energy.
Furthermore, there will be an energy cost to transactions even after no substantial number of bitcoins are mined, since just verifying transactions requires wasteful hashing.
The solar energy part is totally irrelevant--if you happen to have a solar panel sitting around, you could just as easily use it for actual work (and offset coal burning) rather than computing hashes.
> credit card transactions are in fact far more energy-efficient than bitcoin transactions
Huh? You seriously think Visa/Mastercard are using less energy than Bitcoin?
> Furthermore, there will be an energy cost to transactions even after no substantial number of bitcoins are mined, since just verifying transactions requires wasteful hashing.
You can call it wasteful all you want, but if you don't see the value in creating/maintaining a global framework for storing and exchanging value, then I don't know what to tell you. Besides, you are comparing apples to oranges. Bitcoin is a currency. You can build things like Visa around Bitcoin.
Gold being valuable isn't the point. In fact, I think calling bitcoin deflationary indicates the belief it will actually be more valuable in the future.
Question for you regarding your reference to forced inflation and consumerism: Do you believe there would be even a temporary economic collapse if the world were to suddenly move to a deflationary currency? If so, how many months/years would the collapse have to last before you would say that the transition to a deflationary currency isn't worth it?
The reason I ask is because while fundamentally I have a lot of problems with our financial system, I've come to the conclusion that historically there were periods of recession that lasted hundreds of years. I just don't feel that my lifetime in a bad transition economy is worth the sacrifice of moving to a "better" financial system. I'd rather just keep the status quo as it slowly degrades to something worse and worse since I think that's the best outcome for me. It's a selfish outlook, but I have to believe it's the outlook of most people and that's why we are where we are right now.
re: 4. Deflation is bad for the asset rich, but good for the asset poor. Inflation is good for the rich and bad for the poor. Those who usually spout the 'inflation is better than deflation' argument are the rich.
The argument that inflationary expectations are better than deflationary ones, because during a period of deflation purchasing stops since 'it will be cheaper tomorrow', is flawed: such periods of deflation are short-lived, and the market returns to a price at which people are willing to re-enter the market and purchase to gain utility from the goods and services.
Just be careful believing people who "know anything about money" without thinking it through.
No, deflation is bad, period, for the economic actors who depend on deflationary currencies, and this is verified simply by looking at pretty much any historical instance of deflation.
I'm not against Bitcoin per se, but all this anti-inflation crap is, quite simply, at odds with historical data to the point that it has become essentially "views differ on shape of planet". If economics is to be useful at all, it has no choice but to be empirical, and everything we've ever measured has always told us that deflation is bad.
Perhaps Satoshi simply felt it was easier to implement finite Bitcoins than a constantly increasing supply. I don't think he can be blamed for that. Bitcoin is, if nothing else, certainly intriguing.
You are confusing the short lived deflation that could be caused by a deflationary currency, with the deflation caused by all the other problems mentioned in the wikipedia article, like "technological progress that created significant economic growth", "great advances in productivity", etc. Apparently the only problem brought by gold and silver was that there was a scarcity of coins (a physical problem), which would never happen with Bitcoin (a digital currency).
> Perhaps Satoshi simply felt it was easier to implement finite Bitcoins than a constantly increasing supply. I don't think he can be blamed for that. Bitcoin is, if nothing else, certainly intriguing.
No, he was against inflation. In the first block of the blockchain, he added a message; it was something about Bernanke approving a new bailout for the banks.
> No, deflation is bad, period, for the economic actors who depend on deflationary currencies,
USD (or GBP or AUD etc) are deflationary relative to technological items, such as computers. By your theory, no-one should buy such items. In reality, they do.
You absolutely have 4 backwards. Deflation is good for those rich enough to have money sitting around, and who can make money just by sitting on it and not investing it. Inflation is bad for the rich because suddenly they have to DO something with the money, or it disappears.
Inflation is good for the rich's income. Deflation is good for the rich's savings. Seems like they win in both cases? Though I would argue they win more in the first case, since inflation doesn't necessarily force them to do something with the money. E.g., they could just sit on gold or real estate if they wanted to. But most rich people didn't get there by doing nothing, so they tend to do something anyway, no need to force them.
A deflationary currency can be a bad idea as a monopolistic currency.
But adding a deflationary currency to an ecosystem that also contains inflationary currencies isn't the same thing as having a deflationary currency be the only currency available.
The presentation is very well put together, but CoffeeScript is a hack built on top of JS in order to use Ruby semantics. You're correct in that it's not really used outside of Rails, and not knowing JS proper (not that I personally recommend using it for much more than jQuery and a couple other libraries, when you absolutely must) poses a problem for when someone runs into the 97% of code that is JS.
"Not really used outside of Rails" - Do you have any source to back that up, or is it simply what you've seen? I first used CoffeeScript in a Python shop, personally.
JavaScript "developers" feel the need to not use any other language than JavaScript, so they decided to shove it into as many unnecessary places as possible, even if their code more frequently ends up being an unmaintainable mess with no noticeable performance gains.
You are so right. Why try new things and test new ideas? Everything we have is perfect as it is! No need for a car, walking will get you there! We don't need moving pictures, books are fine!
Do you even know anyone that works in Javascript? I happen to like JS but pull out PHP or Python or whatever I need to get the job done when I have to. And I'm not an exception.
I've seen your flawed argument used more and more often around here lately. It's pretty disappointing to see.
Somebody speaking out against deeply broken technologies or bad ideas that have received a lot of hype lately does not mean that such a person "hates progress" or is "resistant to change" or any utter nonsense like that.
Likewise, adopting so-called "new ideas" or "new technologies" does not necessarily mean that real progress is actually being made. JavaScript and NoSQL, for example, are generally regressions in most respects, even relative to 1990s- or 1980s-era technologies.
True progress happens when we move beyond our current abilities. It is not progress in any sense when we start using inferior programming languages or databases, for example.
I always seem to see you in any thread involving "new" technologies that you find "inferior", most of which are not very new and, like all technologies, are tough to put on a precise inferiority/superiority scale in an objective context-insensitive way. I'm beginning to move past annoyance to pure curiosity - why do you care so much what technologies other people find useful? In what way does people's enjoyment of Ruby/Rails, Javascript, NoSQL, etc. harm you so much that you have decided to come and be nasty any time anybody speaks positively of them?
Software doesn't exist in a vacuum. It often has a lifespan far in excess of the involvement of the original developers. It can also have a very serious impact upon its users, its subsequent maintainers, and any organizations they may belong to.
I could not care any less if hobbyist developers want to use Ruby on Rails, JavaScript and NoSQL for their own personal projects that nobody else ever uses or has to maintain.
It's a different situation when such objectively-flawed technologies are used beyond that, however.
The broken software you or others write today using such horrid technologies may very well end up being inherited and maintained by me or one of my teams later on. We won't be happy when we have to waste time, effort, money and opportunity dealing with it and its flaws.
There are numerous, far better options out there. There are just no excuses for using poor technologies these days.
Your goal of seeing higher adoption of technologies you like and lower adoption of those you don't makes sense but your approach is misguided. I rarely see you speak positively of technologies you like, rather than negatively of those you don't. I rarely see you reasonably describe the shortcomings of technologies you don't like from a place of apparent expertise, rather than making ungrounded categorical statements ("hobbyist", "objectively-flawed", "horrid", "poor") about technologies that it doesn't seem like you have bothered learning about in any depth except that necessary to confirm your biases.
Plenty of people would prefer to inherit my broken software using horrid technologies than your broken software using different horrid technologies, and vice-versa. This unnecessary us-versus-them-ism in technology drives me crazy.
NoSQL is a regression in a sense. Or, if you prefer, a conscious trade-off. Providing the same guarantees as traditional relational databases becomes difficult at scale, so you trade some of the features for higher scalability. For example, you can have a very highly available, distributed key-value store, but with no transactions across multiple rows, no query language, no schema, no indexing, etc.
If you think about it, the only real (and most important) advantage of NoSQL databases is that you can make them scale. If only we could make relational databases work like that, I'd happily shove all the data into a store supporting schemas, a powerful query language and ACID transactions.
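As a deliberately toy illustration of that trade-off: with a bare key-value store there is no cross-key transaction, so an operation that touches two keys can be observed half-applied. The account names and the "transfer" operation below are made up for the example:

```typescript
// Toy illustration of the trade-off above: a bare key-value store has no
// cross-key transaction, so an operation touching two keys can be observed
// (or left) half-applied. All names and values here are made up.
const store = new Map<string, number>([["alice", 100], ["bob", 0]]);

function transfer(from: string, to: string, amount: number): void {
  store.set(from, (store.get(from) ?? 0) - amount);
  // If the process dies or a reader looks at the store right here, the money
  // has left `from` but not yet arrived at `to`. A relational database would
  // wrap both writes in a single ACID transaction and never expose this state.
  store.set(to, (store.get(to) ?? 0) + amount);
}

transfer("alice", "bob", 40);
console.log(store.get("alice"), store.get("bob")); // 60 40
```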
It's not quite what I had in mind. RethinkDB schemas are not strictly enforced, and it does not have ACID transactions. I'd much rather see an open source implementation of Google's Spanner, but that's probably a long way off.
I can tell you why I dislike programming in JavaScript (these aren't necessarily faults of JavaScript itself):
1. Complex code usually contains some horrible callback spaghetti, which is not pleasant to debug at all (a small sketch of what I mean follows below).
2. General flakiness, or, as I call it, the "WTF happened?" syndrome. I run into this all the time. Change or add something - suddenly you find the web page broken. Okay, so there's a bug somewhere. Click developer tools - console... blank. No error messages, nothing. Just you and the broken web page. Just the other day I was trying to integrate a select2 input field into some angular.js-powered page and ran into all kinds of weird issues. But why does it even have to be a problem? Why can't I just stick a new control into a web page with a click and maybe a few lines of code? Hey, remember those things called "Visual Basic" and "Delphi"? UI programming with these was a blast, all the controls looked and behaved the same across all applications. And then there was ActiveX. The idea behind ActiveX is almost futuristic compared to the shit we have to put up with.
3. Type system. Made a typo in a variable name? Bad luck, JavaScript will introduce a new variable and you'll have fun chasing a non-existent bug. Oh, look, here's a function: "function foo(param)". I wonder what it does? Unfortunately, the previous maintainer has not left many comments, so now I have to dig through the implementation to find out what the parameter should be and what values the function may return.
These are my main issues with JavaScript. I hope sometime in the future we'll throw out the whole JavaScript/HTML thing and start writing web application front-ends in a language that is more geared towards UI programming. I can imagine having a nice declarative language (instead of HTML) to describe what the interface looks like, and a statically typed imperative scripting language to describe UI behavior. It would also be awesome if these technologies allowed for painless integration of independent UI components (a bit like ActiveX controls). But those are just dreams...
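The sketch referred to in point 1 above -- hypothetical loadUser/loadOrders helpers, first composed with nested callbacks, then the same flow flattened with Promises; nothing here is from a real codebase:

```typescript
// Made-up async helpers standing in for any two dependent I/O calls.
type Cb<T> = (err: Error | null, value?: T) => void;

function loadUser(id: number, cb: Cb<string>): void {
  setTimeout(() => cb(null, `user${id}`), 10);
}
function loadOrders(user: string, cb: Cb<string[]>): void {
  setTimeout(() => cb(null, [`${user}-order-1`]), 10);
}

// Nested-callback version: each async step indents further and error handling repeats.
loadUser(1, (err, user) => {
  if (err || !user) { console.error(err); return; }
  loadOrders(user, (err2, orders) => {
    if (err2 || !orders) { console.error(err2); return; }
    console.log("callbacks:", orders);
  });
});

// Promise version of the same flow reads top to bottom and has one error path.
const loadUserP = (id: number) =>
  new Promise<string>((resolve, reject) =>
    loadUser(id, (err, u) => (err || u === undefined ? reject(err) : resolve(u))));
const loadOrdersP = (user: string) =>
  new Promise<string[]>((resolve, reject) =>
    loadOrders(user, (err, o) => (err || o === undefined ? reject(err) : resolve(o))));

loadUserP(1)
  .then(loadOrdersP)
  .then(orders => console.log("promises:", orders))
  .catch(console.error);
```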
I hear you. I have extensions to your points and disagree here or there but, in general, all your points come down to the same thing: the developer toolchain for JS really frickin' sucks.
Web development is a pain. Web developers have to deal with at least five separate technologies to get anything non-trivial to show up on the page. HTML, CSS, JS, server-side language, persistent store. Each one brings its own config languages or preprocessors, maybe some kind of build system, and, of course, mo' tech mo' problems as they say.
(An aside: I think select2, specifically, is really beautiful and an absolute pain to actually use with any other framework in real usage)
There's some truth to the argument that JS development is merrily rediscovering development methodologies pioneered decades ago:
1. "Yay! With RequireJS I can do real dependency management!"
2. "Did you see that article on how to do conditional breakpoints in Chrome dev tools?"
3. "Using type annotations in the Closure Compiler let you add some kind of type checking!"
All that said, it is getting better:
1. Always use a linter. JSLint if you wanna cry, JSHint otherwise.
2. Callback spaghetti in JS is equivalent, in my mind, to cryptic one-line pointer arithmetic in C/C++. It's a symptom not of the language itself but of the programmer's hostility to future maintenance.
3. Declarative widget-y tech is the future of web front-ends: between AngularJS directives and the evolving Shadow DOM specification (to name two), we're moving in the re-usable component direction.
We're never going to throw out JS. Ever. Every browser vendor would have to simultaneously switch to some staggeringly amazing technology all at once as well as convert all the old browsers as well. It's not going to happen.
JavaScript, if measured by installed runtimes, is the most popular, wide-spread language on the planet. Count the devices in your home that can run JS. That's its true strength, I think.
It isn't necessary for everyone to switch to something new at once. If one big browser accommodates a different language, someone will use it. If it is popular, it will be adopted more broadly.
Not to mention, it can be compiled to JavaScript at first, for backward compatibility. Actually, Google's Dart comes to mind, but I haven't really used it enough to make an informed judgement about it...
The problem with JavaScript and the tooling related to it isn't the existence (or lack) of great tools. Of course, tools as mature and as good as those found for other languages (like the tooling for Java, for example) don't exist for JavaScript, but that doesn't mean there is nothing out there.
I use IntelliJ and I have noticed that I don't make mistakes like missing commas, accidental global declarations, etc., because my IDE tells me about them. It in fact corrects me when I make any such mistakes. The real problem is people still using simple text editors to write JavaScript code because they don't think there is any advantage to using a full-fledged IDE, because they think that there is nothing out there which actually helps.
The numerous flaws with JavaScript (the programming language) are well-known and well-documented. A few search engine searches should bring up ample information.
As for its "greatest flaw", I think that may be the JavaScript community, and the general attitudes within it.
There is, unfortunately, a very high degree of ignorance within the JavaScript community. There are far too many JavaScript programmers who only know JavaScript, or an equally-horrid language like PHP. Having such a limited world view, they don't realize how inherently bad their tools are, and they don't realize how much better they could be.
This ignorance has many side-effects. One big one is that we see a near-complete lack of improvement of the language itself. Any changes that have happened never really address any of the serious flaws with the language.
Another side-effect is that we see JavaScript used in ways that it shouldn't be used, in places that it shouldn't be used. Large browser-based applications and server-side applications (of any size) are two good examples. Asm.js is another. Emscripten is yet another.
There are various other issues with the community, their attitude, and their ignorance, too. We could go on for a very long time about this.
The JavaScript community ends up earning a lot of animosity, if not outright scorn, from those developers who have experience with many programming languages, and who have spent years, if not decades, developing production-grade software in a much more sensible, proper manner.
I don't think that anyone would really care if JavaScript users used it solely as a hobby. But the moment they try to use it professionally, for real-world software systems, they'd better be prepared to defend themselves and their technological choices. They can't bring their amateurish programming language and ignorance to the table and not expect to be treated harshly.
You haven't outlined any points here about the language. I don't want Google to tell me what other people think its flaws are; I was asking you what, in your experience, its flaws are. Looking at your past comments on HN, you seem to leave comments like this a lot.
I guess I'm looking for lists like this:
C/C++
1. The preprocessor allows for horrid misuses with a broken "macro" system that doesn't deserve the name compared to Lisp's.
2. The preprocessor #include system makes compilation slower and more complicated. So much so, in fact, that Google invented a language, in part, to get around it.
3. Dynamically linked libraries are a joke that have no real use in production software.
4. The language syntax is complex enough that creating good parsers for it is extremely hard leading to bad error messages in most compilers.
Why do you think that the flaws that I've personally experienced with JavaScript differ from those that others have experienced and already documented? The flaws are there regardless of who is using JavaScript.
I'll list some of the most obvious and serious ones for you, since you seem incapable of finding this basic information on your own:
- Its type system is horribly broken.
- Its scoping is horribly broken.
- Its comparison operators do not behave sensibly.
- Its prototype-based OO system is impractical, and quite poor compared to other prototype-based languages.
- Its lack of class-based OO leads to awful hacks using its awful prototype-based OO functionality.
- Its lack of support for proper modules and namespacing makes large-scale software development tedious.
- The fact that something as obviously-dumb as semicolon insertion is even conceived of and supported in the first place.
- It's so rife with other bad language features that one of the most widely-respected books about it, Crockford's "JavaScript: The Good Parts", is all about not using large parts of the language.
- Its standard library is extremely limited, and what does exist works quite poorly.
- Its tooling (editors, debuggers, profilers, etc.) is lacking in many respects, and is often entangled within web browsers.
- Its performance is lacking.
- Its community is generally inexperienced and incompetent, and produce a lot of very bad code.
- There's little evidence that things will improve in the future.
While other languages have flaws, none (aside from maybe PHP) have as many utterly stupid, unnecessary and unjustifiable flaws as JavaScript does. And at least these other languages make some real effort to eliminate such flaws, as well. We just don't see that from the JavaScript community.
Nonsense. There's a huge, and very obvious, difference in the overall level of experience, ignorance and competence within the various programming language communities.
The JavaScript, PHP and Ruby communities have an abundance of ignorance, often due to a severe lack of experience. These are the most-hyped languages, and the ones that new developers often flock to. It's quite obvious why so many bad decisions (like using these languages in the first place, or using NoSQL databases) and so much bad code comes out of these communities; their members often just don't know any better, and often aren't willing to learn.
This is much less of an issue within the communities that attract experienced and competent developers. We're talking about C, C++, Python, Haskell, Erlang, Scala, and even Java and C#. Thanks to the wider and deeper experience that the developers in these communities tend to have, we see far fewer blatantly obvious mistakes being made. That's not to say they don't happen; they do. But the quality of the software that is produced is generally much better than what we see produced by the JavaScript, PHP and Ruby crowd.
It's easy to pretend that these very real differences don't exist, but the reality is that they do.
The opinions of one who writes "Javascript 'developers'" are probably not worth knowing. Javascript has stupid scoping rules, and some other obvious flaws, but it's not flawed as in unusable, and it's certainly not flawed as in "'developers'".
The problem with your argument is that those who know programming languages other than JavaScript end up seeing its flaws extremely clearly. Such people won't willingly subject themselves to something that is so inherently broken, especially given that they're aware of the alternatives. Many would feel quite ashamed to associate themselves with it in any professional manner.
Some of us want to get things done rather than sit in an ivory tower being pompous. The notion that a person is not a developer simply because they expanded their knowledgebase to include a language you dislike is asinine.
Plenty of people who know multiple languages use Javascript every day. Of their own free will. Because they want to.
What is it about JS that you find so broken that you have such animosity?
I for one know C, Perl, PHP (not just Wordpress either) and even Coldfusion. I'm currently learning Python because I want to.
And I chose to work in Javascript as my main language because I want to. I have no real issues with scoping, am comfortable with the fact it has no 'class', have a good handle on the fact that everything (including functions) is an object.
Is it flawed? Yes. But then again there isn't a single language in existence that isn't.
is the "developers" dig really necessary? MongoDB is fine-- at least until you start seeing significant scale, and even then, it's going to run into the same issues as everything else would if you hit it too hard, too often.
Just because the query language is JSON-esque and has a javascript shell, it's therefore "Javascript" and a target of derision and scorn?
What is stopping any of these kids from, I don't know, just staying home like a normal person and working on the game in their spare time as a hobby? Nothing indicates any of these children are homeless, fosters, or abused in any way, so one has to assume they're (presumably) regular kids.
Are we (and by we, I partly just mean the tech community, or whatever you wish to call it any more) seriously so introverted as a society any more that we have to encourage 13-year-old boys to sit with their headphones on, near-silent in a house with a bunch of developers potentially ten years their senior instead of going out and acting like a normal person?
There's nothing wrong with wanting to develop something, but come on. Why can't people simply engage like regular human beings any more?
We invite cool guest speakers and make sure they all interact with each other. The best games aren't built in isolation, feedback from peers is extremely important.
I couldn't disagree more. It's very difficult for younger and inexperienced people to know where to start and learn something as difficult as making a game. I know adults who don't even know where to get started with web development.
In fact I would say the opposite is true. Kids should go out and enjoy a collaborative learning environment like this, instead of being isolated in their homes trying to figure things out on their own.
I have two 12 year old cousins who are very interested in getting into video game development. They have some great ideas and are thinking of ways to make games like Minecraft more enjoyable at a base level, as well as new gaming modes and/or features. Not only that, but they are thinking about monetization and how and why people would be interested in this. And they aren't just thinking about Minecraft either. They have ideas for their own games, and games they haven't played yet (not old enough by the ESRB and their parents' standards).
On top of this, they play sports. Outside. Not the EA type. It's only after their homework is done and they don't have any sports activities or friends over playing a board game (remember those?) that they are allowed to play video games. These kids are smart and enjoy the outdoors.
I would definitely say that they are regular kids and extremely engaged as "regular" human beings (whatever that means). At the last family BBQ, when they found out that two friends and I are making our own video game, let's just say I was bombarded with questions about how to start and what to do and learn ;)
Their dad was also especially happy to hear that there are free versions of Blender, Construct2, and Unity3D.
It's going to be interesting to see whether or not the development market collapses in the next couple of years when SF is finally either out of ideas or full of people (if it arguably isn't already), and the rest of the country can't keep in step because most other cities' markets are clamoring just as badly (if not worse) for the seniors. The issue, though, is that it's a catch-22 right now. Companies in smaller cities want seniors so they can get work done and not have to train anyone, but the seniors generally don't want to work in places that aren't SF/NY.
Someone has to budge, unless the market is to die completely. I can't count how many times I've seen companies posting for the same position for upwards of a year instead of simply taking a chance on someone. Sometimes it doesn't work out, but that's business. Don't run one if you can't take the risks.
So true. As a senior-level developer who has zero interest in SV or NYC, my biggest frustration is with companies who demand 5+ years of professional experience in specific skillsets, essentially a drop-in piece for what they think they need right now. This attitude extends to hiring, or in this case not hiring, juniors as well, it seems.
It's one of many frustrations driving me back towards electrical engineering, though EE employers are even worse about this one particular point.
Companies in smaller cities want seniors so they can get work done and not have to train anyone, but the seniors generally don't want to work in places that aren't SF/NY.
Disagree strongly. In fact, I'm moving to Baltimore (from NYC) for a few years (probably) and really looking forward to it. There are a lot of smart people there-- a lot of different kinds of smart people, unlike in, say, the Valley where there's one kind of smart people and that's programmers-- and DC is only an hour away. I might end up in California eventually, but I'm seeing a lot of interest in the most talented people in getting away from the legacy-laden "star cities". It's not quite an "exodus", but I hear more conversations about Austin than San Francisco around NYC. Six or seven years ago, Austin was barely on the map; even New York was the hinterlands except for Wall Street. Now, the general sense is that the Bay Area is for older people who were able to get in and buy a place at a reasonable price, before it got all fucked up.
Makers like new places and open opportunity, as well as freedom from established hierarchies. That means they'll always be moving around from one generation to the next. To tell the truth, though, macrolocation (California vs. Texas; Northeast vs. South) seems to matter a lot less over time, and microlocation (cities vs. suburbs, proximity of cafes and bike paths) matters more. I think that trend's continuing, thanks to the Internet. 20 years ago, or even 10, being an unusual person (3-sigma intelligence; gay; artistic inclination; minority religion or, in many communities, no religion) in a B city meant social isolation. In 2013, it really doesn't; you can find your tribe even if you are, say, an atheist in the South.
Someone has to budge
I think that the next 15 years of talented young people are going to be more dispersed than the last 15. That means there will be fewer superhubs and that's a good thing. It does, however, fragment the labor market, which means that the volatile culture-- the two-sided itchy trigger finger dynamic-- of promiscuous job hopping and fast firing will have to go. Companies will also be more willing to invest in talent. They'll have to be that way; the extremely liquid talent market of the Bay Area now won't exist (anywhere) in 10 years.
What makes you feel that the next 15 years of talented young people are going to be more dispersed? Have you seen any indications of it? Is the idea just that the bubble will burst, scattering everyone throughout the currently underserved market?
I'd also love to hear why you think there will be fewer superhubs (and why that's good). A lot of my reading lately has suggested the opposite - growing superhubs, and why that is a good thing.
What is it you like about Baltimore? I haven't heard of it having many technology jobs but it is an interesting place. I live near Baltimore now. Drop me an email if you have any questions.
The icon in the article looks fine, and the huge irony about iOS7 is that the non-Apple graphic designers voicing their disagreement have generally been nothing but wrong.
Minor anecdote, but I'm overseas on vacation and was in a restaurant when one of the waiters/runners almost freaked out when he noticed that I was running the beta. He was completely enamored with it. Everyday, average people absolutely love the design, and that's what's going to matter in the end.
"But along with the cash, Watsi has also raised eyebrows. To some critics, there’s something distinctly neocolonialist and off-putting about the spectacle of well-off do-gooders in the U.S. choosing which brown people live and die in the developing world based on who has a cuter picture on Watsi. Others wonder whether focusing donations on individuals, no matter how worthy, diverts funding and attention from efforts aimed at tackling the more systemic causes of inadequate healthcare in impoverished parts of the world. Watsi must also bear the misfortune of coming of age during a simmering backlash against Silicon Valley. We’ve gotten tired of hearing the name brand Silicon Valley bigwigs who have invested hundreds of thousands of dollars in Watsi talk about the merits of “disrupting” existing industries, when all that really seems to end up happening is that a few people get rich while the competitive screws tighten on the many."
You have an urge to save a life? Make a direct donation yourself or to one of the hundreds of charities that exist. Some charities are certainly better than others, but the options are out there. Watsi is a pathetic excuse for a "company" where this sort of pick-and-choose comes across as a form of good when it's really quite creepy and saddening.
Startups don't need to exist in order to do good, and you don't need to hide behind the veil of backing a company in order to seem like a good person. Give someone the money yourself or go visit those countries if you really give a shit, because there's more to their problems than individuals who happen to need medical care. Most of them don't have any at all.