anhner's comments

when it takes 10 seconds to do anything on Jira, it's not hard to see why people want alternatives

Except that is not the reason, and that’s not new haha

updog? what's updog?

It's an uptime service from DataDog, an enterprise event/log/SIEM/monitoring/APM company, like Splunk. So what they do is watch uptime for your favorite large business.

Oh no, what if they put on a Christmas music playlist in February? The horror!

There should be something between "don't allow anything without unlocking the phone first" and "leave the phone unlocked for anyone to access", like "allow certain voice commands to be available to anyone even with the phone locked".


Playing music doesn’t require unlocking though, at least not from the Music app. If YouTube requires an unlock, that’s actually a setting YouTube sets in its SiriKit configuration.

For reading messages, IIRC it depends on whether you have text notification previews enabled on the lock screen (they don’t document this anywhere that I can see). The logic is that if you block people from seeing your texts from the lock screen without unlocking your device, Siri should be blocked from reading them too.

Edit: Nope, you’re right. I just enabled notification previews for Messages on the lock screen and Siri still requires an unlock. That’s a bug. One of many, many, many Siri bugs that just sort of pile up over time.


Can it not recognize my voice? I had to record the pronunciation of 100 words when I set up my new iPhone - isn’t there a voice signature pattern that could be the key to unlock?

It certainly should have been a feature up until now. However, I think at this point anyone can clone your voice and bypass it.

But as a user I want to be able to give it permission to run selected commands even with the phone locked. Like I don't care if someone searches Google for something or plays a song via Spotify. If I don't hide notifications when locked, what does it matter that someone who has my phone reads them or listens to them?


Personal Voice learns to synthesize your voice, not to identify it.

But you understand why if I don't care about that, I should be able to run it, right?

you can, you can turn locking off.

But the point is, you are a power user who has some understanding of the risk. You know that if your phone is stolen and it has any cards stored on it, they can easily be transferred to another phone and drained. Because your bank will send a confirmation code to that phone, the transfer still counts as authorized, and you will be held liable for that fraud.

THe "man in the street" does not know that, and needs some level of decent safe defaults to avoid such fraud.


I understand why you'd want to do it.

Oddly enough, I also understand Apple telling you: good luck, find someone's platform that will allow that, because that's not us.


you should look into using subagents, which each have their own context window and don't pollute the main one
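
Roughly the idea, as a hedged Python sketch (call_llm here is a hypothetical stand-in for whatever model API the tool actually uses, not Claude Code's real interface):

    # Each subagent works against its own fresh message list, so its
    # intermediate output never enters the main agent's context window.
    def run_subagent(task, call_llm):
        sub_context = [{"role": "user", "content": task}]  # isolated context
        return call_llm(sub_context)                        # only the final reply escapes

    main_context = [{"role": "user", "content": "refactor the parser"}]

    # The main context only grows by the subagent's final answer,
    # not by everything the subagent read or generated along the way.
    report = run_subagent("read src/parser.py and summarize it",
                          call_llm=lambda msgs: "stub summary")
    main_context.append({"role": "assistant", "content": report})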

and yet, there are obviously some companies that are better than others.


It offers a GUI for easier configuration and management of models, and it allows you to store/load models as .gguf, something ollama doesn't do (ollama stores models split across multiple files - and yes, I know you can load a .gguf in ollama, but it still makes a copy in its own format, so I either need a duplicate on my drive or have to delete my original .gguf).


Thanks for the insights. I'm not familiar with .gguf. What's the advantage of that format?


.gguf is the native format of llama.cpp and is widely used for quantized models (models whose weights are stored at reduced numerical precision to cut memory requirements).

llama.cpp is the actual engine running the LLMs; ollama is a wrapper around it.
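
For example, a .gguf can be loaded straight from disk with llama.cpp's Python bindings, no ollama re-import step (a minimal sketch; the model filename and prompt are just placeholders):

    # pip install llama-cpp-python
    from llama_cpp import Llama

    # Point directly at the quantized .gguf file (placeholder path).
    llm = Llama(model_path="./mistral-7b-instruct.Q4_K_M.gguf", n_ctx=4096)

    out = llm("Q: What is the GGUF format? A:", max_tokens=64)
    print(out["choices"][0]["text"])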


> llama.cpp is the actual engine running the LLMs; ollama is a wrapper around it.

How far did they get with their own inference engine? I seem to recall that for the launch of Gemma (or some other model) they also launched their own Golang backend, but I never heard anything more about it. I'm guessing they'll always use llama.cpp for anything before that, but did they continue iterating on their own backend, and how is it doing today?


> properties that no natural fabric can offer

like polluting every inch of the Earth with microplastics!


> We have clothes and materials like gortex now that blocks rain and snow no handmade jacket could ever hope to perform at the same level to be lightweight AND dry.

At massive cost to environmental, animal, and human health.


Sure, let's all ditch Linux and macOS as well, since they're not the most popular...

