
This is absolutely a problem, but I'm not convinced the author has found a solution. I'd have to try it to know for sure, but from the description, it still sounds finicky.

I believe touch screens are fundamentally a bad interface for productivity. Consider the range of actions provided by a mouse: You can hover without clicking, you can left click, or you can right click, all with nearly pixel-level precision. Add in a keyboard and your options expand even further.

A smartphone is like a computer with a one-button mouse and an abnormally large, irregularly shaped cursor, where you can never be sure which part of the cursor indicates your actual click target. Software on this computer is not aware of the cursor's location until after the mouse has been clicked, and portions of the screen are blacked out when you move the mouse to certain positions. Your keyboard only works when you bring up an on-screen overlay which takes up ~35% of your screen real estate, on a monitor which is abnormally small to begin with.

Could any amount of well-designed software make text entry efficient on this machine?

This is a hardware problem, not a software problem.



Why not something akin to a "fixed-offset cursor"? The actual cursor/pointer is always a centimeter above the finger position (thus always visible; the bottom of the screen would be handled as a special case), finger movements manipulate its position, and hold duration accesses alternate modes.
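A minimal sketch of that mapping, assuming a 1080x2400 screen and a ~38 px (~1 cm) offset; all names and constants here are illustrative, not from any real API:

```python
# Hypothetical "fixed-offset cursor": the visible pointer floats a fixed
# distance above the finger so it is never hidden under it. The offset
# shrinks near the bottom edge (the special case mentioned above) so the
# cursor can still reach the last row of pixels.
OFFSET_PX = 38                      # ~1 cm; a real app would query screen density
SCREEN_W, SCREEN_H = 1080, 2400     # assumed portrait resolution

def cursor_from_touch(tx: int, ty: int) -> tuple[int, int]:
    """Map a raw touch point to the visible cursor position."""
    offset = min(OFFSET_PX, SCREEN_H - 1 - ty)   # shrink near the bottom edge
    cx = min(max(tx, 0), SCREEN_W - 1)           # clamp horizontally
    cy = max(ty - offset, 0)                     # cursor floats above the finger
    return (cx, cy)

print(cursor_from_touch(500, 1200))   # → (500, 1162)
print(cursor_from_touch(500, 2399))   # → (500, 2399), bottom edge stays reachable
```

Hold-duration modes could then be layered on top, e.g. switching from "move" to "select" after the finger rests in place for half a second.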

For a somewhat similar implementation (touchscreen is essentially handled as touchpad) look at how Teamviewer handles remote sessions to Windows desktop from smartphones.

I would really like any/all of those as a toggle-on mode for use on Android itself... selecting text with arthritic fingers is a pain – literally and figuratively.


Try out the app Quick Cursor[1]; it does a lot of what you describe. It's pretty intuitive too; you build muscle memory for it quickly. 1: https://play.google.com/store/apps/details?id=com.quickcurso...


"A smartphone is like a computer with a one-button mouse "

Not quite. A touchscreen can detect many fingers that can do many gestures.

The problem is, except for zooming (pinching with 2 fingers) and moving around (swiping with 2 fingers), the potential is pretty much unused.


I have trouble imagining that more than 2-3 fingers can be realistically used for input at once. For one, you normally have to use one hand to hold the phone, which uses up 3-4 fingers. And to enable frequent multi-touch gestures you'd have to hover your hand awkwardly over the phone, which is fine for the occasional pinch-to-zoom but not if you have to do it all the time.


But with 2-3 fingers you already have lots of combinations. 2 thumbs are enough. I have been toying with some ideas for a game and thought about the potential for normal use cases.

None of it is intuitive (the way swiping and pinching are), though, and it can only become useful once muscle memory kicks in. So I guess that is the reason we do not have a global standard for anything advanced in multitouch.


Given 2 finger pinch-spread is already taken for zoomout-zoomin, what are the remaining useful combinations for 3 fingers? It can't detect which finger you're using. It should support one-handed operation, so the 3 finger gesture should be possible with one hand. You'll likely have to use middle and ring fingers, which are mechanically connected, so relative motion would be difficult.

I think you're wrong. There are very few combinations available for things like text editing.


"You'll likely have to use middle and ring fingers"

Well, people who play the violin or the like can do this, but yeah, it was hard work to learn and cannot be expected of the general user.

So yes, it cannot be anything complicated.

One of my ideas is to use the positions of the fingers and the timing of their touches. No matter which finger, what matters is the position they touch down at relative to each other. So 2 (or 3) touches, like a double click, but with 2 fingers, and you evaluate the positions.

Simple variant: "pointing finger", "middle finger" -> one action (one touching point, then another touching point shortly afterwards to the right of the first touching point)

"middle finger", "pointing finger" -> another action (first touching point, then the other touching point to the left of the first one)

Or you can use the 2 thumbs, or thumb and pointing finger. And use up and down as well. So you have "middle" "up"; "middle" "down"; ...

And the actions could be anything, like "middle" "left" could select the next word to the left or go to the beginning of the line.

My main thinking revolved around game actions, but it is the same principle.

And then you could also extend it with "middle" "left" "middle"; the only limit is the coordination skills of your average user.

And ... the hardware limits. I never really implemented those ideas, because I got frustrated very quickly (some years ago) with how the native touch handling of the OS got in the way, and with some imprecise sensor input. But if you keep it simple, it should be doable. I guess I am now motivated to give this experimentation another shot real soon.
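The position-plus-timing idea could be classified roughly like this; the 300 ms chord window and 40 px dead zone are assumed thresholds, and the function name is made up for illustration:

```python
# Classify a two-tap chord by where the second touch lands relative to the
# first: which finger is used doesn't matter, only relative position and order.
CHORD_WINDOW_S = 0.30   # assumed: max delay between the two touch-downs
DEAD_ZONE_PX = 40       # assumed: minimum separation to call a direction

def classify_chord(t1, t2):
    """t1, t2: (timestamp_s, x, y) of the first and second touch-down."""
    (ts1, x1, y1), (ts2, x2, y2) = t1, t2
    if ts2 - ts1 > CHORD_WINDOW_S:
        return None                              # too slow: two separate taps
    dx, dy = x2 - x1, y2 - y1
    if abs(dx) < DEAD_ZONE_PX and abs(dy) < DEAD_ZONE_PX:
        return None                              # too close: ambiguous
    if abs(dx) >= abs(dy):
        return "left" if dx < 0 else "right"
    return "up" if dy < 0 else "down"            # screen y grows downward

# e.g. "middle finger", then "pointing finger" to its left -> "left"
print(classify_chord((0.00, 400, 900), (0.12, 250, 905)))   # → left
print(classify_chord((0.00, 400, 900), (0.10, 410, 700)))   # → up
```

An editor could then bind "left" to select-the-next-word-to-the-left, and three-touch sequences like "middle" "left" "middle" would extend this into a small state machine over successive touch-downs.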


I like playing music and enjoyed the multitouch puzzle game posted on HN a while back, so for a game like the one you're suggesting, fair enough.

For text editing, I don't think the cardinality of the set of gesture-based solutions is very large.


It's a good point, but I'm not sure how to use those gestures for text entry. TFA didn't go down the multi-touch route either in their proposal. But it's possible I'm not being creative enough!


If you forced me to implement something without doing any research, I’d suggest:

First (iOS perspective, but it's not that different from Android): convert the suggestions row to a row of function keys anytime Shift is held down. No annoying delays or extra taps waiting for the menu pop-up. Now you can use two fingers to cut/copy/paste/undo instead of spending several seconds on a popup that can appear anywhere onscreen.

Next, arrow keys. Heck, use the same Shift hack and, when Shifted, replace the space bar and the dead, useless area below it with a big inverted T.

Our phones have gotten much bigger, and yet, especially on iOS, the keyboard hasn't grown a pixel since what, iOS 5? Case in point: still no number row for Apple, even as an option. Wtf.
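If I were to mock up that Shift behavior, it might look like this; the key names and bindings are hypothetical, not an actual iOS or Android API:

```python
# Shift held: the suggestions row becomes function keys, and the space bar
# area becomes an inverted-T arrow cluster.
SUGGESTIONS = ["the", "and", "you"]                       # normal word suggestions
FUNCTION_KEYS = ["cut", "copy", "paste", "undo", "redo"]  # hypothetical bindings

def suggestion_row(shift_held: bool) -> list[str]:
    """Return what the row above the keyboard should display."""
    return FUNCTION_KEYS if shift_held else SUGGESTIONS

def bottom_row(shift_held: bool) -> list[list[str]]:
    """Return the bottom key area: space bar, or an inverted-T arrow cluster."""
    if shift_held:
        # inverted T: up on top, left/down/right beneath it
        return [["", "up", ""], ["left", "down", "right"]]
    return [["space"]]

print(suggestion_row(True))   # → ['cut', 'copy', 'paste', 'undo', 'redo']
print(bottom_row(True))       # → [['', 'up', ''], ['left', 'down', 'right']]
```

The point of routing everything through one modifier is that no new gestures have to be learned: Shift already exists, and the extra keys only appear while it is held.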


iOS already has three-finger pinch to copy/paste, and three-finger swipe to undo/redo. But I find these gestures too unreliable for regular use: they almost always activate some other click target, plus my other fingers are busy holding the phone.


>exept zooming (pinching with 2 fingers)

2 fingers is already too much. Maybe I could've done it with 2010 smartphones, but new ones are so large that I switched to double-tap then drag ages ago. It's annoying having to involve a second hand just to zoom something, and trying to do it with a single hand is just a recipe for destroying your phone.


3D touch solves some problems for touch interfaces, but sadly Apple killed it (everywhere except Magic Trackpad, for some reason).


It never worked though. It would just pop in randomly at unexpected times. It sucked.


Hmm. It worked fabulously for me, and spared me from waiting a beat every. Single. Time. that I want to get what we now know as a long-press menu.


The problem was they made deep press and long press do different things, so it chose the wrong one regularly. I don't know why they ever thought that was a good idea. It was obvious that deep press should have just been "faster long press".


How did you get it to work on bumpy car rides, or while jogging?


There's your problem. I never exercise (JK).

But seriously, I suppose I just never use enough pressure when I'm "just" regular-tapping to inadvertently activate force touch under any circumstances, so I never had any ambiguous inputs.


I LOVED it. I could move the text cursor with precision AND do text selection with the same precision and FAST. In some situations faster and with more precision than I would have been able to do with a mouse even.

Such a great loss that they dumped it.


Cursor move is still available with a long press of the spacebar, but I dearly miss the selection. That was the killer application of 3D Touch, I think.


The first gen or so of Android phones had an optical 'trackpad' below the screen; some were terrible, but some were really good, allowing far more precise cursor movement than a touchscreen. I wish this feature had survived; it was awesome for text editing.


It did, in a fashion. On my Fold 5, for instance, I can half-fold it and get a trackpad on the bottom half of the screen, which is great! If that were just a general toggle between "direct touch" and "screen as trackpad with a cursor", that'd be just grand.


Some had a trackball which was fairly precise.


well yeah, a touchscreen is never going to be a keyboard, but that doesn't mean things couldn't be better


> a touchscreen is never going to be a keyboard, but that doesn't mean things couldn't be better

... they growled, struggling to pierce the sow's earlobe.


> This is a hardware problem, not a software problem.

This is a UI validation problem.

Apple proved in 2007 that you can port tons of applications over to the smartphone. They had to invent their own language for interactions though ("pinch-to-zoom" etc) and it took them two weeks of focus with all their software development staff involved to fix keyboards on capacitive touch.

It may not be possible to reach the same kind of flexibility on a mobile device when it comes to rich text editing, but it's certainly possible to port over a lot of functionality from the desktop.


Do you really think Apple invented pinch-to-zoom? CMU Sensor Lab had it back in ’85. Steve Jobs coincidentally visited soon after and later claimed to have patented the technology for the iPhone. That was shown not to be true in the big Apple vs. Samsung patent case.

References available upon request.


Not to imply any specific doubts—I’m aware of the narratives—but if you have references, I’m interested! Request, request.



No, I don't think they invented multitouch, I was referring to the name of the gesture. "Pinch to zoom" is afaik their invention, as opposed to "two or more input points applied to the touch-sensitive display that are interpreted as the gesture operation", as the article mentions.

The issue is about naming and communicating UI in an intuitive manner. Which should be solved by acceptance testing.


As far as clicking and basic functions go, is a two-button mouse really that different from tapping + short-holding to open a menu wheel? There are a lot of pain points for me when comparing a touch screen to a PC setup, but the mouse isn't one of them.


I agree, but note that a lot of this is solved by having a pen with buttons.


I agree, styluses are great! Placing the cursor and highlighting text on e.g. a Nintendo DS is a lot easier than on an iPhone. (Everything else about typing on the DS sucked of course, but it's not like Nintendo put substantial effort into that experience.)

I do find that e.g. the Apple Pencil doesn't have a small enough tip for text selection to work well, it's really made for drawing.


(just to add, the Latin plural of "stylus" is "stylī", so in English we can use "styli" or "styluses")



