Although the language i have been working on includes time travel as a built-in feature, i find that i almost never use it, because in a deductive language, it is not that easy to create a bug.
Time travel is great for understanding someone else's large program, because you can see where it goes as it runs. In large projects the instruction pointer hops around madly throughout the code, and seeing its path, like the path of a bird, is rather handy.
Most of his predictions are nonsense, and show a lack of understanding of the logical foundations of modern computing. His assertion that instruction sets are "obsolete" is absurd. The Turing/Von Neumann computer uses an instruction set and RAM to do computing. He is proposing some nebulous jelly computer that would be unprogrammable.
The idea that ordinary people can write their own OS is laughable. There are several million lines of code in every OS related to just drawing and responding to a text entry field. There aren't even 100 people who know the TrueType language which underlies all font rendering. The last small-team-built OS of any accomplishment was the Oberon project at ETH, and that was decades ago, before the web, and its team of very smart people was not ordinary.
Hardware has to evolve with software. Intel's 3D XPoint memory tech, which was marketed under the Optane name, failed because people didn't know how to use it properly. New hardware will easily be stunted if it can't get traction. Look at the failure of the Adapteva Epiphany chip. It was a 10X improvement in computation per watt, but nobody knew how to program/debug a chip with thousands of independent cores.
The underlying mathematics that we all use originates from the linear, sequential Greek proof, and we have not yet developed a parallel logic mathematics.
The Multics operating system, which was funded by DARPA at MIT's Project MAC for 10 years, was superior to Unix from a technical standpoint, and it used segmentation.
Unix has paging but not segmentation. Segmentation allows you to grow a memory block in place, without copying it as it grows, something that plain paging cannot do.
No, there is a typo in the video. The SDK is the best authority for Beads examples, as those are checked on each build for any errors. We have about 20 sample programs, some large enough to show most of the language features.
Excel achieves its robustness (and it hardly ever crashes) by protecting the user against arithmetic errors and circular reasoning. Excel doesn't offer infinite-precision arithmetic; that is rarely needed in practice.
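To illustrate the idea, here is a minimal sketch (not Excel's actual engine; the `CellValue` type and `divide` function are my own invention) of arithmetic where errors become ordinary values, like Excel's `#DIV/0!`, so one bad cell can never crash the sheet:

```typescript
// A cell holds either a number or an error value, as in a spreadsheet.
type CellValue = number | "#DIV/0!" | "#VALUE!";

function divide(a: CellValue, b: CellValue): CellValue {
  if (typeof a !== "number") return a; // propagate an upstream error unchanged
  if (typeof b !== "number") return b;
  if (b === 0) return "#DIV/0!";       // error becomes a value, not a crash
  return a / b;
}

console.log(divide(10, 2));            // 5
console.log(divide(10, 0));            // "#DIV/0!"
console.log(divide(divide(10, 0), 3)); // "#DIV/0!" -- the error flows downstream
```

Because every operation is total over `CellValue`, the whole formula graph stays evaluable no matter what the user types.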
Null pointer errors plague Java, and they are primarily caused by computations happening out of sequence. By using the State-Action-Model pattern, most Beads code will be devoted to rendering the model on the screen, and that one-way transfer of information can be made very robust, provided you have a closed arithmetic, which Beads does.
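A rough sketch of the State-Action-Model shape being described (the names here are mine, not Beads syntax): actions are the only way to change the model, and rendering is a pure one-way function of the model, so no render step can corrupt state or read it out of sequence.

```typescript
interface Model { count: number }

// Every state change is an explicit, named action.
type Action = { kind: "increment" } | { kind: "reset" };

// update is the single place the model ever changes.
function update(model: Model, action: Action): Model {
  switch (action.kind) {
    case "increment": return { count: model.count + 1 };
    case "reset":     return { count: 0 };
  }
}

// Pure render: reads the model, never mutates it -- the one-way transfer.
function render(model: Model): string {
  return `count = ${model.count}`;
}

let model: Model = { count: 0 };
model = update(model, { kind: "increment" });
console.log(render(model)); // "count = 1"
```

Since `render` cannot write to the model and `update` runs to completion before any rendering, the "computation out of sequence" failure mode has nowhere to hide.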
Meteor has a lot of users, and is still around. The problem with Meteor is that it uses JS, and that language has a lot of problems. TypeScript has clobbered Meteor, as people have realized that JS makes it far too easy to make a typographical error that crashes at runtime.
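A hedged illustration of that failure mode (the `user`/`naem` names are made up for the example): in plain JS, a misspelled property silently yields `undefined` and the crash only surfaces later at runtime, while the typed version is rejected before the program ever runs.

```typescript
const user = { name: "Ada" };

// The (user as any) cast simulates untyped JS access: the typo goes unnoticed.
const typo = (user as any).naem;  // undefined, no error here
console.log(typo === undefined);  // true -- the bug lurks until the value is used

// With types, the same typo is a compile-time error:
// const bad: string = user.naem;  // error: Property 'naem' does not exist
const ok: string = user.name;      // compiles and runs fine
console.log(ok);                   // "Ada"
```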
JS is still ahead of TS in usage according to the Tiobe Index and GitHub language commits. As far as typographical errors go, everyone already knew JS was a dynamic language, like Python and Ruby. I'd say other frameworks like React and Angular, or their predecessors, out-competed Meteor.
if i can find a better cross platform delivery mechanism than AIR that isn't too far away from JS, i will switch, as there are a fair number of Linux-based programmers out there. Sorry for the current inconvenience of requiring Wine. But it does run okay according to some of my users.
The problem with both Swift and Kotlin is that you must use libraries to draw and handle events, which typically ties you to a single OS (Swift only really being used on OS X at present). Kotlin might rely on the very extensive JVM libraries, which are horrendously complicated.
I don't know what you mean by a step backwards. This is a batteries-included environment like VB6 and Borland Delphi, but it emits to the current web app universe we live in, so it is definitely in the now.