I guess by "standards pertaining to the web" Apple means HTML, JavaScript (ECMAScript, to be precise) and CSS.
What do you mean by "standards pertaining to what software is allowed to run on your computer"?
Packaging systems and their associated programming languages / APIs: Android .apk files, Apple .dmg disk images (more precisely, the .app bundles they carry), Microsoft's .msi installers. All are open and usable by anyone who wants to build one (though some fees may apply).
Less an explicit standard than the previous consensus that the company that sells you your computer shouldn't retain control over what legitimate software you can run on it once it's in your possession, based on some self-aggrandizing notion that it knows what's best for you better than you do.
That's exactly the mentality that keeps 'users' stupid.
Apple is basically saying: "users are stupid, let's protect them from thinking." Imagine if we said that about our kids: "Our kids don't know how to handle the real world. Let's protect them from having to deal with it." (never mind that most parents do in fact go through this phase).
The reality is that users are just kids who haven't learned how to use computers yet. Two factors make this hard: most computer software is poorly designed (in terms of the sheer number of poorly designed products versus well-designed ones), and most software is not designed to teach users how to use it.
Apple is busy buying fish for starving people. The company that teaches users how to fish is the company that will win big.
That's not about keeping users stupid, it's about not making them worry about stuff they shouldn't have to worry about.
The user may be the world's best neurosurgeon, does that mean that they have to learn about filesystems?
"most computer software is generally poorly designed"
You are right about this one. But the thing is that Apple does exactly that: offering well designed software.
It's not about buying fish, it's about hiding unneeded complexity. How do you drive a car? Press the gas and it goes, press the brake and it stops, turn the wheel to the right and it turns right. You need zero knowledge about what's going on under the hood.
Now take the iPad: tap an app and it launches, press the home button and it stops. Swipe to the right, swipe to the left…
"The user may be the world's best neurosurgeon, does that mean that they have to learn about filesystems?"
There's a bit of a divide as to what 'personal' computers are being used for. The original mainstream use was to create/edit files using programs. The modern use is to interact with other people via the internet. So no, the neurosurgeon doesn't have to learn about filesystems because filesystems are mostly becoming irrelevant.
I don't think software should be designed to educate people about the trappings of decades of computer cruft. I think software should be designed so that using the software teaches the user how to use the software. Apple's method (re: iPhone/iPad) seems to be to design software that doesn't invite learning, and at the additional cost of limiting functionality.
"Apple's method (re: iPhone/iPad) seems to be to design software that doesn't invite learning"
Could you clarify this, perhaps by comparison to something else? Everything in my experience strongly supports the notion that Apple's general approach leads to much greater levels of competency and independence than anything else out there right now.
Turn the wheel and it goes where you want it to. But if you want to go to Google Voice, you'll have to take a long detour that few people know about. You see, the road to Google Voice hasn't been optimized for your comfort. Or rather, it hasn't been optimized for Apple's profit.