Hacker News | EvenThisAcronym's comments

I think the cognitive function model is fine and useful, which is why Dr. John Beebe's model is so appealing.

My main issue with Myers-Briggs is the use of a questionnaire to type people - I'm an ESFP, but every time I've taken an MBTI test I've gotten back INTJ or ISTP. I never felt that either of those was completely accurate, or that I was an introvert. It's far, far more accurate to type people based on their preferred styles of interaction and thinking (e.g., as an ESFP I have an informative, initiating, movement-based interaction style and I think in concrete, pragmatic, interest-based terms).


Myers-Briggs is absolute trash and should not be used by companies for anything. Big 5 is also almost completely worthless beyond vaguely describing a person's general tendencies.

The only system that seems to have any sort of validity and real-world use is Dr. John Beebe's Eight-Function Model.


> The problem with all-chat in League of Legends, particularly in the ranked queue, is that it can only hurt your team's chances to win. At best, you're wasting time typing; at worst, you're actively frustrating and distracting yourself from what is already a very difficult game.

I don't agree; /all is a very viable vector for giving your team a huge leg up. If you are skilled at talking shit, you can usually tilt someone on the other team so much that they become a non-factor. Many times I've seen the enemy team turn on each other because one of their laners made some small mistake, which I capitalized on as a chance to relentlessly attack them in /all chat. They become tilted and start playing poorly, their team starts attacking them as well for the poor play, and often they will either troll, constantly call for a surrender, or even AFK in the fountain. The mental aspect of the game is just another tool in your arsenal for gaining an advantage over the enemy, and in my opinion it's just as important as the mechanical and strategic aspects.


> As just one anecdote, the mere presence of a GC made me stop looking at D more or less immediately.

That's a shame, because it is extremely easy to write GC-free code in D. One very important fact is that if you never allocate through the GC (using "new", appending to a slice, etc.), *the GC will never run*. D provides a ton of tools to verify that you're not doing GC allocations and, if you are, to find where in your code they happen (@nogc, -vgc, and even -betterC).
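As a minimal sketch of what this looks like in practice (the function names here are my own invention, but @nogc is a real attribute the compiler enforces):

```d
// @nogc is checked at compile time: any GC allocation inside a @nogc
// function is a compilation error, not a runtime surprise.
@nogc nothrow
int sumSquares(const(int)[] values)
{
    int total = 0;
    foreach (v; values)
        total += v * v;
    return total;
}

void main() @nogc
{
    int[3] buf = [1, 2, 3];          // static array lives on the stack, no GC
    int result = sumSquares(buf[]);  // slicing a static array doesn't allocate
    assert(result == 14);
    // int[] bad = new int[3];       // error: cannot use `new` in @nogc code
}
```

On top of that, compiling with -vgc reports every GC allocation point in the whole program, so you can audit code that isn't annotated.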


> But it also seems like D is not suitable for a lot of things that C++ is, like embedded applications

Not sure where you got this idea, as being usable for embedded programming is explicitly one of D's design goals. You can use D anywhere you use C++, and it is a much nicer language to boot!
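For the embedded/minimal-runtime case specifically, -betterC strips out druntime and the GC entirely, leaving a C-compatible subset. A tiny sketch (assumes a C runtime is linked in, as with any C program):

```d
// Compile with: dmd -betterC app.d
// No druntime, no GC, no TypeInfo; just D syntax over the C runtime.
import core.stdc.stdio : printf;

extern (C) int main()
{
    printf("hello from betterC D\n");
    return 0;
}
```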

> Also, it doesn't seem like support for Android and iOS is very good at the moment

I can't comment on this. There is support for both platforms IIRC, but I haven't tried writing apps for either in D.

> But I'm hesitant to build anything with it because as much as I loathe working in C++, it's still the safest bet by far.

I'm not sure what you mean by "safe" here. D has been around for over 20 years and isn't going anywhere anytime soon. The main difference from C++ is that the community is smaller, and it's a bit harder to get a job writing D (but not that much harder nowadays; there are a lot of people working for companies like Symmetry Investments, writing D every day).

> Also, Swift seems like it has similar goals, but with Apple's full support behind it.

Swift also seems like a very nice language, and I would probably be using it if D didn't exist. That being said, its metaprogramming capabilities are woefully underpowered compared to D's. In my experience, only the Lisp family of languages beats D in this area.
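To give a flavor of what I mean (a small sketch; the names are mine, but CTFE, template mixins, and string mixins are all standard D features):

```d
import std.stdio : writeln;

// CTFE: an ordinary function, evaluated entirely at compile time
// when its result is needed in a compile-time context.
ulong factorial(ulong n) pure
{
    return n <= 1 ? 1 : n * factorial(n - 1);
}

enum fact10 = factorial(10); // forced compile-time evaluation

// A template mixin that generates a private field plus a getter
// from a type and a name, via string mixins.
mixin template Property(T, string name)
{
    mixin("private " ~ T.stringof ~ " _" ~ name ~ ";");
    mixin(T.stringof ~ " " ~ name ~ "() const { return _" ~ name ~ "; }");
}

struct Point
{
    mixin Property!(int, "x");
    mixin Property!(int, "y");
}

void main()
{
    writeln(fact10); // 3628800, baked into the binary at compile time
}
```

Swift has generics and (more recently) macros, but nothing as direct as running plain functions at compile time and splicing generated code in with a one-line mixin.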


> Rust has Non-Lexical-Lifetimes (and soon Polonius). D's documentation is very sparse about how lifetimes in @live will be determined, but I think it's safe to assume that it will be scope-based at first.

@live also has non-lexical lifetimes: https://forum.dlang.org/post/r742l6$2ofo$1@digitalmars.com


> I couldn't sleep much (sometimes 4 hours max)

Did you find some way to mitigate this or did you just have to live with it during the fast?


I lived with it.

At the beginning I was really worried, but as I said I was very alert despite lack of sleep/food and I was able to work etc.

My personal takeaway from this experiment was that we don't need that much food/sleep to thrive. But again, that was just my personal experience, and I wish there was more research on it instead of the constant mainstream message (sleep 8 hours, eat 5 meals, etc.).


> Exercising might reduce the loss of muscle mass to some extent

That does indeed seem to be the case, at least for a 16:8 intermittent fasting regimen (an 8 hour eating window with 16 hours of fasting): https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5064803/

"Our results suggest that an intermittent fasting program in which all calories are consumed in an 8-h window each day, in conjunction with resistance training, could improve some health-related biomarkers, decrease fat mass, and maintain muscle mass in resistance-trained males."


Then use garbage collection; it's an easy default.


>If you need typed functional programming, your options are OCaml, Scala, F# or even subsets of Rust and TypeScript.

And don't forget D. It's surprisingly well-suited for FP: its effects system has explicit support for pure functions and for marking a function as guaranteed not to throw, in addition to true (transitive) immutability. Not to mention full first-class function support, with lexical closures.
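A quick sketch of those pieces together (my own toy example; pure, nothrow, immutable, and closures are all standard D):

```d
import std.algorithm : filter, map;
import std.array : array;

// pure + nothrow are compiler-enforced: no side effects, no thrown exceptions.
pure nothrow @safe int square(int x) { return x * x; }

void main() @safe
{
    // Transitive immutability: neither the slice nor any element can be mutated.
    immutable int[] xs = [1, 2, 3, 4];

    // First-class functions and lexical closures: the lambda closes over `threshold`.
    int threshold = 2;
    auto big = xs[].filter!(x => x > threshold)
                   .map!square
                   .array;
    assert(big == [9, 16]);
}
```

The nice part is that purity isn't just documentation: calling anything impure from inside `square` is a compile error.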

