Siri, What Time Is It in London? (daringfireball.net)
728 points by jmsflknr 13 days ago | 748 comments

At some point about a year ago I noticed that I could no longer ask my Google Home devices "what's the weather?". I'd just get a generic "I don't understand" response. But more specific queries such as "What's the weather in Seattle?" would work.

After a couple of weeks of this, I somehow got the idea that it was related to the devices' configured locations. And sure enough, telling the Home that I lived in the next city over fixed the problem.

So I started a binary search and eventually found that the issue was limited to my ~10x20 block Seattle neighborhood - basically the outline shown when I search for its name in Google Maps. I then also realized that it applied to weather queries on my phone as well, but since the phone uses GPS rather than a specific location setting, I could only reproduce the broken and working behaviors by crossing one of the neighborhood boundary streets.

Turns out it was some long-standing configuration issue with Knowledge Graph's entry for my neighborhood, and some recent code change in location-based weather queries began butting heads with it. Luckily I worked at Google at the time and was able to track down and pester people that could help fix the issue.

But if you hadn't worked at Google at the time, there would have been basically zero chance of getting anyone at the company to do anything about it.

Reminds me of when I worked at Yahoo, for not very long in the early 2000s. Yahoo had a consumer product called My Yahoo, a customisable home page with various feeds in it, and at the time they also had a My Yahoo Enterprise Edition, which they tried to sell to companies to serve as an intranet home page. I worked (in a junior sort of way) on the European deployments of MYEE.

Anyway, one of our customers - representing a company in Germany I think - filed a bug report that said something like "Weather module hasn't updated since January". They'd been going to their fancy intranet home page and seeing the same weather for months at a time.

And this bug report just sat there. For a mixture of technical and political reasons, there seemed to be nobody in the European office able to pick up this report and do anything meaningful with it. We knew about it, we knew that what we were serving to paying customers was hopeless, but we somehow couldn't get our hooks into the right point in the Weather feed to figure out where it was going wrong. Or, collectively, we didn't care enough.

You could replace Yahoo with any other company and this would be equally true.

If this had happened at some of the other places I have worked, the CEO would have made all the developers stay in the office until the problem was fixed. Why did no one care?

To say no one cared is going too far, but it is interesting to think about why something like this was able to happen.

There were various technical and structural factors making it difficult to fix. Weather feeds were known to be problematic (still are I guess) and this code would have been surprisingly low-level C/C++ with custom serialisations and limited logging. Structurally there were problems in getting attention from a team in California to support a problem experienced by a different team in London, especially since it affected relatively few users - a tension in supporting paid products in a company that is focused on non-paying users at far greater scale.

(I am assuming this bug would have needed some actual development work - I don't recall, but I think we were familiar enough with common ops problems that it wasn't just a question of kicking one of our own feed servers.)

But I do think there was an issue about lack of concern - at heart we didn't have enough confidence in our own product to motivate the personal pain of working through these problems and getting them solved. I think that, if you had gathered us together and asked our collective opinion, we would have suggested that this customer would be better off not using our product at all - it simply wasn't ever likely to be good enough. Once you reach that way of thinking about your own product, it becomes extremely hard to countenance fixing the most difficult problems with it.

Those sound like pretty bad places to work. And very bad internal processes.

Don't have to name them, but what kind of places are these that care? Big, small, tech, non-tech? Locations?

Well, for me, it was finance, where you know each and every one of your clients by name because each of them is paying millions of dollars a year for your product. The company had around 50 employees and a $200M valuation.

My address has been blatantly broken in Google Maps for years now. When you enter it, for some reason it deletes the house number and just looks up the street, which ends up pointing to somewhere about a 10 minute drive away. For example, if my address was "123 Elm Street" it just ignores the "123" and searches for "Elm Street" instead. We have to give special instructions all the time to delivery drivers and other people we give our address to, to warn them and make sure they don't end up going to the wrong place if they use Maps.

I've sent feedback and error reports about this repeatedly, and even had a friend that knows someone that works on Maps pass it on to them directly. It's never been fixed, and I've basically just given up on it at this point. It's really shown me how impossible it is to get any kind of support from Google for even an extremely obvious, straightforward issue.

My address has been blatantly broken in Google Maps for years now

I recently moved, and found out that virtually every single web site from my credit cards to my bank to the library uses Google to verify address entry on the fly. The problem is that Google's database entry for my address is wrong. So any time I try to enter the address "123 Oak Street, Apartment Q" Google unhelpfully corrects it on the fly to "Oak Street, Suite 1." No amount of keyboard jockeying can override Google's on-the-fly autocorrection.

The solution I eventually came up with was to turn off javascript, then enter my correct address, then turn javascript back on to finish the rest of the form.

Of course, there's no way to contact Google about its error. Maybe in Google Maps? I dunno. How do you find an address that Google Maps doesn't know to tell it that the address it has is wrong?

I had issues with google maps being broken for the address when I moved in. I reported it too many times to count using the website.

In the end what worked for me was registering as a Google Maps client/customer, reproducing the issue via the API, and then reporting it as an API issue. The underlying data was fixed within a day or two, and I got my emails answered by a Google engineer within (literally) minutes.

Ironically your best bet to get it fixed is almost certainly to post your real address here as part of this thread.

Or post another address with the same problem, assuming it affects the entire street.

It doesn't. Even the address for the other half of the duplex (my address + 2) works fine. Sometimes we just provide that address instead and catch the delivery people when they arrive, because it's easier than worrying about the Maps issue.

I had an issue with my own address on google maps (not as severe as this, it just had the wrong zipcode). After trying to correct it using the tools built in to maps 3 times and having my correction rejected every time, I tweeted my complaint @googlemaps, did a quick DM back and forth and it was fixed shortly thereafter.

(please do not take this as an endorsement of google maps support, merely an anecdote of what did work for me that I hope might help you)

As a former Uber/Lyft driver, I probably ran into some kind of map error at least once every 8 hours of driving. These databases have a lot more garbage in them than I realized when I wasn't driving as much.

Same here. Delivery people who have never been to my house consistently go to my neighbor's house a mile away because that's where Google sends them. What's so frustrating about this is that both houses have very clear address markers on the street. But people now trust Google more than they trust actual address signs IRL.

Curious if the address is correct in Apple Maps?

With Apple, I've submitted perhaps 5 corrections for 5 different (usually minor) problems in 5 years. Problems like a place claims to take Apple Pay when it doesn't. Or the actual place is across the street from where Maps claims it is.

In each case, Apple sends back a notification within 2-3 weeks saying they've fixed the problem, and when I've checked, it has always been resolved. Pretty happy with the service.

I would suggest navigating to your home in Maps and ending up at the wrong address. When it asks for your feedback (nowadays it usually asks based on where you live), give it the lowest rating (the frowning smiley). In some regions, it will take you to another page asking you to describe in detail what happened. Mention your story there; hopefully someone looking at bug reports will read through it.

Well, this can be a feature rather than a bug if you like privacy. I knew a house that was on a large, mostly wooded piece of land where Google Maps would direct you to the wrong side of the property via a dead-end street in a development. If you went there, nothing but trees for a thousand feet or maybe two. Probably at least a 10 minute drive if you actually knew the street that the driveway connected to.

A couple years ago, I sent a report to google maps about a small museum in a remote corner of Norway. The map had the location of this museum about a hundred meters off. Not long thereafter, I received a thank you note saying that indeed, it was wrong, but now they fixed it. And indeed they had. I suppose I should have been surprised at this success. Or maybe they were more responsive before.

Huh, I've forgotten that I have the same problem with my address. If you look up my address, Maps will point to my house (let's label it X St. 36), but the street segment in front of my house has the name of the parallel street (Y St.) a block south of it. On Google Maps, X St. incorrectly changes to Y St. 100 or so meters before reaching my house...

As annoying as that sounds for a home address, imagine it being a business address. It would cost that business a lot of money.

Yes. In this instance you're probably correct.

Write a blog and post it to HN, hopefully you'll gain traction

What if you asked Google "what's the weather?" and it gave you a dictionary response of "weather" as an answer? Or with more snark, responded with, "I don't know. Look outside".

"What is the weather report for today/this week?" is a more precise question, despite an annoying amount of verbosity. But answers are still relative: "Cloudy" could be an accurate answer for now, but it will be "Sunny" this afternoon.

Some people will prefer a one-word answer to "What's the weather?". Others will want an hourly breakdown of the day displayed on their screen. Others might prefer a week. It's hard to give an ideal response for every situation.

Ironically, if I say "Hey Siri, weather" I usually get what I need.

> What if you asked Google "what's the weather?" and ... with more snark, responded with, "I don't know. Look outside".

Now that you mention it, I find it strange that voice assistants don't give natural responses when they encounter an error. It would make them seem more real, and it would be less frustrating.

When you ask a human what the weather is and they tell you to look outside, you don't try to rephrase the question in a way that will make them give you the right answer, you just realize this is not the way to get an answer and you look for another way.

Maybe this isn't the best way for it to work in this case, since there is a different thing you can ask to get the response you want, but maybe this would make interactions better if, say, the phone can't detect your location. "Where did you want the weather for? I think I'm lost."

Of course, the other problem is that if it gives the same response every time it'll get grating. The hundredth time you hear "Look outside," it has probably lost its charm. I wonder how possible it would be to generate responses that take into account all previous conversations, so that this doesn't happen.

Put another way, joking or snarky responses from a human aren't just jokes, they're a form of social communication. If voice assistants used those forms without intending to communicate the same thing, that would just be frustrating in a different way.

Or with more snark, responded with, "I don't know. Look outside".

One of my meteorologist friends always answers "What's the weather?" with, "The state of the atmosphere."

“What's the time?” “What's up?”

The possibilities are endless.

I've actually had this with Siri. I've asked about the wind and Siri gave me the definition of wind.

Other times it will tell me the wind speed.

Other times, for the same query, it will tell me the wind speed AND direction, which is what I want.

But it's always random what I get: wind speed, wind direction, a combination of both, or (occasionally) the definition of wind.

They spend a lot of time on these requests, I think: "sing me a song", "are you insane", etc.

Was this just your neighborhood, or was it a more structural problem? Did your colleagues find more locations with similar issues?

IIRC it had something to do with the parent hierarchy of my neighborhood. It was parented by both a postal code and a city, or something. I'm probably making that up, but I'm sure you can imagine how difficult it is to create and maintain abstractions for every possible geographic situation.

I wasn't aware of any other occurrences.

Not only can a zip code span cities, a single city/zip can span multiple tax authorities, e.g. parts of Redmond, WA zip code 98052 are under the Regional Transit Authority and others are not.

I think that pretty much any assumptions we make about strict hierarchy are bound to be broken at some point.

I worked with zip codes on a project. Zip codes are a nightmare. They cross everything. There are zip codes which cross state-lines. Zip codes which are not a single contiguous area of land. So many issues.

I feel your pain. It turns out there is even a group of ZIP codes that apply only to specific businesses, buildings, or even people (if you're important enough).

That's because zip codes aren't defined by an area, but by postal routes.

Zip codes don’t denote geographic regions; they denote nodes in the postal routing network.
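Put concretely, the consequence is that any lookup keyed on ZIP has to return a set of places, not a single one. A toy sketch (the two ZIPs below are commonly cited oddities, but the table itself is illustrative, not real postal data):

```python
# Toy sketch: a ZIP maps to a set of places, not a single one.
zip_places = {
    # 42223 (Fort Campbell) famously straddles the Kentucky/Tennessee line
    "42223": [("Fort Campbell", "KY"), ("Fort Campbell", "TN")],
    # 10118 is a single building (the Empire State Building)
    "10118": [("Empire State Building", "NY")],
}

def states_for_zip(zip_code):
    """A single ZIP can touch several states, so the result is a set."""
    return {state for _, state in zip_places.get(zip_code, [])}

print(states_for_zip("42223"))  # one ZIP, two states
```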

That's a fun problem to have debugged. I bet that if it had gone through customer service, it would never have been untangled, let alone fixed.

Siri doesn't know that my front door is called "FRONT DOOR".

I only have one smart lock, which works perfectly, and it is called "FRONT DOOR" in HomeKit.

When I ask Siri about my FRONT DOOR she responds that she cannot find it.

When I ask Siri about the status of my DOOR, she responds with "The FRONT DOOR is locked/unlocked".

I'll then say 'Alright Siri you literally just used the phrase "FRONT DOOR" five seconds ago and the text transcript on the screen says "FRONT DOOR" hey Siri is my FRONT DOOR locked'

Siri: WTF are you talking about? You don't have a FRONT DOOR.

"Hey Siri is my door locked"

Siri: Your FRONT DOOR is locked.

Google and Alexa handle things flawlessly.

About once a week Siri and I have this conversation:

Me: "Siri, turn off the bedroom lights."

Siri: "OK. Your 6am alarm is off."

For the most part Siri works for me, with the exception of the above and her insistence on adding "ginger ale" to my grocery list as two items.

/Native English speaker, specifically trained in non-regional diction because I used to work on-air in radio.

Me: "120 degrees Fahrenheit in Celsius"

Siri: "Contacting emergency services in five seconds"

To be fair, I was in a noisy environment and Siri only got the "120" part, but seriously, why would that be okay? My phone is registered in America with an American phone number and English set as its only language. Why should it think 120 is equivalent to 911?

Looks like 120 is the emergency number for ambulances in China, emergency at sea in Norway, police in Guatemala, and national police in Bolivia [1]. Alternatively, maybe it interpreted "120 degrees Fahrenheit" as a body temperature reading for an extremely high fever?

[1] https://en.wikipedia.org/wiki/List_of_emergency_telephone_nu...

I was having problems with a phone service where I needed to "press 1 to continue," but it wasn't registering. Eventually I ended up pressing 112, and my iPhone displayed "Calling emergency services," even though it's an American phone in America.

I know some countries use 112, but that's too many edge cases colliding.

In fairness here, 112 is an international standard (I think it's in the GSM spec?) and is expected to work in the US.

I guess that makes sense. But it doesn't explain why the iPhone intercepted those digits when I was punching into a phone menu system.

>Native English speaker, specifically trained in non-regional diction because I used to work on-air in radio.

Well there's your problem (/s): https://youtube.com/watch?v=Avp9aUkM5g0

It could be worse. It could have said "OK".

In the past week:

me: "Hey Google, add half and half to the shopping list."

gh: "I've added those two things."


me: "Set an alarm for 2:30 tomorrow"

gh: <generic alarm set response>


wife at 2:30AM: "hey... HEY... why's the alarm in the kitchen downstairs going off?"

Well, you specifically said 2:30 tomorrow... it sounds like it triggered at 2:30 the next day.

And never mind the literal interpretation of the command - it's far more usual to set an alarm for very early in the morning (got to catch a plane, unusual event) than for the afternoon.

Very true. I hadn't specified AM or PM as to when I wanted to start the dinner roast. :-/

It's nice to see that using Siri is a normal thing to do :) People at work make fun of me for using it.

That conversation sounds like Siri thinks you're about to go to sleep and wants to make sure you remember to enable your wake-up alarm.

No, that’s what Siri says when it disables your alarm lol.

She wants you to get more sleep so she's disabling your alarm.

Not in my experience. This is how 80% of my Google Assistant conversations go:

Me: "Hey Google, play Nine Inch Nails, you know, the one in my Google Play library"

Google: "OK, playing Nine Inch Snails, a band nobody on Earth has heard of and is definitely not in your library!"

Me: (Repeat a few times, trying all kinds of accents, eventually I get tired of songs about nine inch somethings, and I pull the car over and type in by hand what I'm looking for.)

This seems, to me, to be something else. Google Music consistently avoids playing the most popular version of a song. It's always a cover by no one, or a different song by no one with the same title. I've started to think it's a royalty thing.

Ah but Nine Inch Snails is a classic https://www.youtube.com/watch?v=S0qQ1pomYzk

And what bugs me the most about the Google Play Music Library is that it won't sync up with the YouTube Music Library... They have two different music services which I can use with one subscription but I can't have some of the songs on both of them or have one single library.

You're going to have to, as they're going to merge


Thanks for calling this out! I guess I really should check regularly for every Google service I use if it's being discontinued yet.

There's that Google Graveyard site, maybe they should make an aaS out of it, e.g. you can do an API call with services you use, and it will respond letting you know if any of them is about to be euthanized by Google...

All these speech assistants are almost stateless, aren't they? I mean, they can ask you questions and go into a listen-for-an-answer state, but, at least last time I tried, you can't have a conversation about your conversation with them and improve their comprehension of anything. It's like talking to a command line, or those old Palm Pilots you had to learn a special alphabet to scribble on.

It's quite interesting to hear very small children talk to voice assistants. Probably not surprising, but it seems like in normally-learnt human communication you expect your conversational partner to remember the context of what you were saying, and carefully forming canned commands is a separate learned skill. It suggests these voice assistants have still got a way to go, and it seems more like a paradigm change, a big leap, than just incremental improvements.

When I made a meeting room reservation chatbot, the hardest part was managing a conversational context without the user getting stuck.

In the end I had a “draft booking” and a conversation loop, where the bot would repeatedly ask to fill in missing parts (eg nr of participants) and then give you a summary and opportunity to correct things. It was hard to do, and definitely required a lot of contextual understanding of how people book meeting rooms. That approach doesn’t scale up well.

I think the basic problem is being stuck in a local optimum. The scripted bot approach doesn’t scale to complex conversations, and you need to start from scratch to do better.
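For illustration, that "draft booking" loop can be sketched in a few lines of Python. The slot names and prompts here are invented, and the NLU that actually fills the slots is elided:

```python
# Minimal sketch of a slot-filling conversation loop: repeatedly prompt
# for the first missing slot, then summarize for confirmation.
REQUIRED_SLOTS = {
    "room": "Which room would you like?",
    "time": "What time is the meeting?",
    "participants": "How many participants?",
}

def next_prompt(draft):
    """Return the question for the first missing slot, or a summary."""
    for slot, prompt in REQUIRED_SLOTS.items():
        if slot not in draft:
            return prompt
    return (f"Booking {draft['room']} at {draft['time']} "
            f"for {draft['participants']} people. Confirm?")

draft = {}
print(next_prompt(draft))    # asks for the room first
draft["room"] = "Blue Room"  # filled from (hypothetical) parsed user input
print(next_prompt(draft))    # moves on to the next gap
```

The hard part, of course, is not the loop itself but letting the user jump around and correct earlier slots without getting stuck.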

Ahh the good old "conversation is a state machine" pitfall. Even linguists I work with do that sometimes, I guess it's how the simple models that we're taught with work.

Wanna have the simplest parser? Finite State Automaton to the rescue! So people automatically assume that the simplest approach to conversation is also something like a finite state machine.

Here's the thing. The only reasonable FSA would be a clique.

You can always move between nodes.

A much more feasible approach is the "actions competing for relevance" one, where you have global state manipulated by actions, and all the actions generate an "applicability score" for the given user input. The system then chooses the most appropriate action, and it does its thing. And on the next user input the cycle repeats.
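A minimal Python sketch of that dispatch cycle, with invented action names and scoring heuristics:

```python
# "Actions competing for applicability": each action scores the user
# input; the top scorer handles the turn.
def book_room_score(text):
    return 0.9 if "book" in text or "room" in text else 0.0

def weather_score(text):
    return 0.8 if "weather" in text else 0.0

def fallback_score(text):
    return 0.1  # always weakly applicable, so some action always wins

ACTIONS = [
    ("book_room", book_room_score),
    ("weather", weather_score),
    ("fallback", fallback_score),
]

def dispatch(text):
    """Pick the action with the highest applicability score."""
    name, _ = max(ACTIONS, key=lambda action: action[1](text.lower()))
    return name

print(dispatch("Book me a room for 3pm"))  # book_room
print(dispatch("Erm, what?"))              # fallback
```

Unlike an FSM, there are no transitions to enumerate: any action can win any turn, which is exactly the "clique" property mentioned above.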

Honestly, I can forgive the lack of context awareness. That's a hard problem. I have issues even getting consistent responses to the same query over time (even back to back in some cases). Sometimes, Siri will misunderstand me and fail to do the thing, but then I look at the text that's transcribed... and it's correct (i.e. the backend was replying to a different transcription than the one I saw on the frontend).

I've just been trained to not bother. Unless I'm setting a timer, I just don't try anymore.

> you can't have a conversation about your conversation with them and improve their comprehension of anything. It's like talking to a command line, or those old Palm Pilots you had to learn a special alphabet to scribble on.

Which is why I have no confidence calling it AI if it's not even intelligent. It's just voice recognition on top of preprogrammed operations.

>Which is why I have no confidence calling it AI...

That's because it really should be called Simulated Intelligence, which would be a much more accurate description. The marketing team wouldn't like this, though.

Are they even simulating intelligence? To the parent poster's point, it's really just simulated voice recognition. No intelligence even gets simulated. This is more like running Cucumber scripts based on voice recognition.

It's an artificial idiot, plain and simple.

Yeah, that's what I'm wondering about - is it going to be hard to implement things like this until LSTMs with better "memory" get good enough for consumer use?

You can't even have the "thank you" / "you're welcome" part of the dialogue.

I don't know if it works there, but Siri couldn't understand one of my contacts' names.

When it read the name back badly mangled from an alias, I said something to the effect of "could you pronounce that correctly?" and it asked me to say it.

Since then, it's understood that person's name. ¯\_(ツ)_/¯

It really needs to expose the option to train those easily

Indeed, you can say "That's not how you say that" to correct its pronunciation of any name.

There is a "Pronunciation" field in the contact where you can spell out how a name is pronounced.

This article clears everything up, https://discussions.apple.com/thread/8116586

Similarly, I have two smart outlets and one smart lightswitch in a room. One is called "Drew's LEDs", and the other is called "Drew's Heater". When I ask google to "Turn off the lights!" it turns off the switch and the "Drew's LEDs" smart outlet, but not the "Heater" smart outlet!

I definitely appreciate the effort they put into understanding the semantic nature of a device from the name I assigned it. Nowhere did I ever designate one of the smart outlets to "behave like a light."

Scary! I would never have anything involving heat (oven, space heater, etc) or water connected to the IoT. Way too much risk of fire or flood from a bug or a hack.

The people who cut my grass are in my phonebook, under, let's say "Lawn Cutting Corp".

If I say "Siri, call lawn cutting corp", she'll say "I'm sorry, I can only call a single person at a time."

If I say "Siri, call lawn corp", then it immediately opens the phone and says "Dialing lawn cutting corp."

What's most likely happening here is that any door can be classified as "front door", and Siri doesn't know whether you want to open the door which has been marked as "front door" (you have none?) or the one named "FRONT DOOR".

File a bug and it will probably be fixed.

My computer is a laptop, which is on my desktop, and there is a desktop on that laptop, and on the desktop is "My Computer".

I think I just made a palindrome of homonyms.

Is it named in capital letters? I've noticed some systems will see capital letters and read them as F-R-O-N-T, etc. Maybe try renaming the door to lower case.

At this point, better to stop using Siri and just check for yourself, no? :D

This example illustrates how difficult AGI is and how far we are from it. We, humans, tend to take advantage of the context to make communication simpler and shorter. Just think about all the implications of this one simple question: what time is it in London? Or e.g. how can I get from London to Dublin?

If the person asking the question lives in Ohio, they may actually be talking about London OH (or Dublin OH). Some people in neighbouring states may mean the same, though they will be more likely to mention the state. However, how close should you be to London, OH even within the state to mean the Ohio one and not the UK one? How close is close enough? Is a few hours of driving close enough? A 3 hour flight? What if I'm roughly at 6 hours from London OH and 7 hours from London UK?

Further, if the person is a British expat in Ohio, especially if they are working for a multinational business (or not), they would more likely mean London UK. German expats, though? Russians? Or an Irish person who lives in Amsterdam having some relatives in Ohio US, looking to book a flight to Dublin. Etc. etc.

There are so many contextual layers here that even human assistants can occasionally get it wrong, and without the context the task becomes insurmountable for the "AI" algorithms. That is not to say virtual assistants are useless, just that selling them as "AI" is a big lie, bigger than even those who market these algorithms as "AI" think it is.
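A toy resolver makes the tension concrete: it has to combine global prominence with per-user context signals. Every number and weight below is invented purely for illustration:

```python
# Toy "which London?" resolver: mix global prominence with user context.
CANDIDATES = {
    "London, UK": 9_000_000,  # rough population, as a prominence proxy
    "London, OH": 10_000,
}

def resolve(user_context):
    """Pick the candidate with the best prominence + context score."""
    def score(city):
        prominence = CANDIDATES[city] ** 0.5  # damp the population gap
        return prominence + user_context.get(city, 0)
    return max(CANDIDATES, key=score)

print(resolve({}))                      # generic user: prominence wins
print(resolve({"London, OH": 10_000}))  # strong local ties flip it
```

The hard part is producing the context signals in the first place; the arithmetic is the easy bit.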

> If the person asking the question lives in Ohio, they may actually be talking about London OH (or Dublin OH).

I would seriously doubt this assumption. Why on earth should someone living in a state specifically ask for the local time in a different location within that same state?

On the contrary, this context information would make it much more likely that the person actually meant "London, England". Except if there is a timezone border going through the state, of course.

However, I obviously agree with your general point regarding the severe limitations of what we currently call "AI" and how little "intelligence" there actually is.

> Why on earth should someone living in a state specifically ask for the local time in a different location within that same state?

True, and that's another contextual layer to deal with: knowing that, e.g., the state of Ohio is in a single timezone, and therefore - as you said - why on Earth would someone ask for the time within their own timezone? And then there may be contextual exceptions even to that rule...

You might not know where the timezone boundary is - they move, or you're new to the area. Alternatively, there could be a daylight savings boundary in between, so it is only the same timezone for half the year.

The fact that Americans are inclined to say the state name as part of the name of a place could also help - since they might say "London, Ohio", a plain "London" might be more likely to mean the real London.

Flipping this around a bit: if someone in England asked "what's the time in London?" should Siri assume that they're talking about London in Ohio or Ontario? Everyone in England is on GMT, so they don't need to do timezone conversions to London time.

There's an extremely long-standing convention of referring to timezones by the major or capital city within that timezone, so Siri should of course assume that they're talking about London in England. In fact, this is probably more or less the correct, canonical way to ask this - asking about GMT would not give you the correct answer, since England is currently on BST which is GMT+1. Needless to say, this argument would not apply to asking about anything else about London. Context is complicated.

For similar reasons, anyone who asks for the time in Boston probably means Eastern Time regardless of how far they are from Lincolnshire, though I think the more usual and canonical way of referring to that is New York time.
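For what it's worth, the IANA time zone database bakes in the same convention - zones are named after a prominent city - so "the time in London" maps cleanly onto standard tooling, including the GMT/BST switch. A quick sketch using Python's standard library (3.9+):

```python
# IANA zone names follow the "prominent city" convention, so city-based
# time queries translate directly into zone lookups.
from datetime import datetime
from zoneinfo import ZoneInfo  # stdlib since Python 3.9

london = datetime.now(ZoneInfo("Europe/London"))
boston = datetime.now(ZoneInfo("America/New_York"))  # "New York time"

print(london.strftime("%H:%M %Z"))  # GMT in winter, BST in summer
print(boston.strftime("%H:%M %Z"))
```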

Pedantic: London is currently on British Summer Time, not GMT (which it uses in the winter).

I'd expect it to ask the first time around, and then remember the answer afterwards.

There are many European countries that have the same timezone, but I don't remember all of them always, despite living in Europe.

In this case it is totally normal for someone living in the Netherlands to ask the time in a Polish city.

> Why on earth should someone living in a state specifically ask for the local time in a different location within that same state?

Who said they were in that state when asking? People travel.

True. So substitute "living" with "currently located in". Makes the problem of correctly considering the context of that question just even more complicated ;-)

I find it unlikely that someone in that scenario would ask "What time is it in London?" rather than "What time is it in Ohio?"

I ask for the time in SF or LA all the time, rather than the time in California. Regardless, it's not hard to find cases in the same state that cross timezone boundaries. Cottonwood, AZ vs Cottonwood, AZ 86503 (on Navajo time) is an example.

SF and LA are major cities. London, OH is not.

I assume it triggers just on location, not in the context of a time zone. I think Siri just hears "tell me something about <location>" and then defaults to the nearest one.

> Why on earth

Just use your imagination a bit.

Maybe I live in state X at location Y while my parents live at location Z in the same state, about 250 miles away from me, maybe there’s a serious storm where I live and I wonder if I need to check on my elderly parents?

But you'd still be in the same time zone, no? For different queries, like the one you mentioned, that might be reasonable, but asking for the time in London means the assistant should infer England's time zone, if you're in Ohio.

Unless you live in a state like Nebraska that spans multiple time zones…

True, there are definitely edge cases to account for. Maybe the problem is these AI assistants are trying to make everything location-aware when that may not always be the desired behavior. If there are ambiguities the assistant could ask you to clarify, but that might make for a worse UX.

> Why on earth should someone living in a state specifically ask for the local time in a different location within that same state?

Because they want to know what time it is in a different location.

14 states have more than one time zone. Do you know which ones?

> Do you know which ones?

No, and that’s irrelevant trivia. What I know is whether the state I live in is on that list. Mine is Oregon, and it is. I don’t care what the other 13 are if I’m asking for a time in the state I live in.

The point is that just because you know the state doesn’t mean you know the time. They’re unrelated. It is reasonable for a person in Portland to ask what time it is in Ontario. They may know there are two time zones around eastern Oregon but where is the line?

Idaho is the same way. What time is it in Riggins? I’m from Idaho and I don’t even know the answer to that question. It’s a reasonable thing to ask Siri.

That's a harder question. The important question here is "does your state have more than one time zone?"

No, it isn't important. What's important is how to answer the question that was asked. The number of time zones in a state has nothing to do with the time in a location. You just need to disambiguate what location is the subject of the question.

Well, it's no less important than this.

"14 states have more than one time zone. Do you know which ones? "

All of these situations are fairly complicated. So the best thing to do (even for a human assistant) is to ask more questions. That is the best way to get clarification, rather than just trying to guess the context. The assistant should be able to ask a simple question: "Are you talking about the London in the UK or the one in Ohio?"

The problem is, for some subset of people, the question is ambiguous. Thus we can reduce the problem to finding that subset and asking additional questions only in those cases.

Exactly. And what about if I initially DO specify it's London Ontario that I'm interested in but five minutes later I ask again, referring to it simply as "London".. shouldn't a "better Siri" come back with 'assuming you still mean London Ontario, it's...' ?

I think it's possible that general acceptance of these non-AI gimmicks being referred to as "AI" will end up pushing genuine progress in true AI further into the future.

Agreed. These solutions seem gimmicky. AI as it is today is really a chain of "if"s. I think aside from some amazing results in niche areas, so much of the attention to ML in general (and DL in particular) is just due to marketing. It's a bubble in many cases, but since everyone is doing it, companies decide to do it as well.

I have worked on that problem recently, and yes it is really hard.

You would be surprised by how many cities in the world are named 'San Francisco': https://en.wikipedia.org/wiki/San_Francisco_(disambiguation)

I often think city names must constitute some part of their citizens' identity. How is living in a fake San Francisco pleasant for people who live there?

I think you're making basically the same mistake that Gruber is in the article, and that all "location based services" are making. The context isn't the current location, it's where people's attention is.

To take Gruber's example, if you had an office in London Ontario, were talking about setting up a video call, then asked your assistant "what time is it in London", and they picked London England because it's the most famous, you'd question how smart they were.

The context is not where you are, or which "London" is largest or most popular or least driving distance away or where you grew up or where you lived once, the context is why you are asking about the time in London, it's all of the present brain/attention/conversation/local state bundled together.

> you'd question how smart they were.

If I'm physically in a city named London, and you ask me literally "what is the time in London" that's actually an ambiguous question. Why would you specify the name of the place you're currently in? Typically someone in that instance would just ask "What time is it?"

I don't hire people to make "educated guesses" on my behalf. If they detect an ambiguity then I expect them to initiate a dialog that resolves that, not just blithely pick the first thing that comes to their mind.

> If I'm physically in a city named London,

But Gruber is not in London. If he was, perhaps we'd have a discussion about what the right answer might have been, maybe we need some more clarity, that kind of stuff. If I ask a stranger outside (well, if I did before everyone isolated themselves) they would immediately give me the time in London, England, and if I actually wanted the time in Ohio I would have to clarify myself.

> I don't hire people to make "educated guesses" on my behalf.

Of course you do. Do you really want people asking you questions all day when there could be any ambiguity?

Anyone want to guess where this sign is located?


I wonder if it actually points at Mecca…

It points to the town of Mecca which is close to the Salton Sea in Southern California.

Looks to be in Kentucky. Has both a Paris and a London, only 90 miles apart, so it's easy enough to pick a point on highway 64 that would be nearer to Paris either coming from Louisville or Huntington. Impossible to know which direction though without more context.

As a non-American, can someone explain why there are US cities with the exact same names as cities in Europe, Egypt, Greece, etc.?

Because settlers from Europe came to America and didn't give af what the Indians called their land. So they named places after ones they knew; New York, New England, Boston.

Toronto, Ontario, Niagara, Ottawa, Canada, Mississauga.

Canada kept a few.

I guess Mississippi is one, but most US places have white names.


Heading west from Clappison's corners near Waterdown.

Lol look at that, a London and Paris right next to each other in Ontario too... nicely done.

in the US, based on the sign style and the road striping, and the drainage grate

On the side of the 403.

somewhere near Versailles, KY ;)

I wonder which will come first: AI that is truly able to teach itself about our world, or that we are able to define algorithms or otherwise figure out software solutions to most of these context problems. Of course, if an AI is intelligent enough, that problem solves itself.

To me it seems that developing AI on that level will be here sooner than developing solutions to many context problems, given the difficulty the best funded algorithms in the world have answering this question which we humans see as very simple.

That's exactly what gave rise to the first "AI Winter". People had to hard-code things. But with NN and esp. Deep Learning, they thought there's no need to write programs for pattern recognition anymore; the "AI" just learns it on its own.

That wishful thinking has turned out to be dangerous tho, as we have moved towards an ML-dominated world where we don't even know how the ML algorithms produce specific results.

Add that to their bizarre behavior (like this example with Siri) and you'll realize chances of another AI Winter are not low. If we have to develop solutions to many context problems one by one, that may reduce so much of the hype and interest in "AI". We'd be basically back to square one.

> We'd be basically back to square one.

But with better tools! It might not be going from 0 to 1, but going from 0.1 to 0.2 is still progress.

It is THE area people have been working on in AI.

Multi-hop reasoning models have started working surprisingly well. ie. reasoning over multiple levels and conditions.

Common sense reasoning is also getting a lot better. By having huge knowledge bases, the model can actually learn some degree of human like general purpose context. Such as, returning the time at the London which has the most similarity with the user's hidden representation.

There's a higher-order set of contexts as well. Everyone in the conversation (the asker and the answerer) knows that "London" likely means "London, UK" - that's the Bayesian default. So if the asker wants to inquire about London, Ontario (and they are not continuing a previous conversation about London, Ontario), they are very likely to be explicit in their ask: "What's the time in London, Ontario?"

The asker has a mental model of the answerer's default contexts, and if their question is likely to be ambiguous in those default contexts, they are more likely to be explicit in their ask. The converse is also true - if the asker is not explicit about which London, that's actually a signal to the answerer to lean even harder on default contexts and best guesses.

Humans do this without even thinking, because through culture and conversation we are quick to arrive at shared mental models and lean on shorthands "in the other person's head". AI not only doesn't have its own context, it doesn't have an estimate of its user's context and where the two might differ.

I disagree. Should the resident of London, Ontario say it every time to a virtual assistant about the weather when it should be obvious they are interested in the place of residence? And so on, all the same context layers apply (British expat in London, Ontario, etc.)

If they were somehow calling a global hotline that could answer arbitrary questions for free, then yes, I suspect someone in Ontario would still clarify. If it was a local hotline, maybe not.

Which is Siri?

Some additional variations:

What if you were calling an individual human personal assistant who knew you lived in London, Ontario?

What if you had previously clarified to this person that when you said London, you meant London, Ontario?

I think both of these questions ought to be relevant to the digital personal assistants that we're creating.

How about asking for context if it's not clear? Any reasonable human assistant would ask if the context is not clear (enough).

Obviously virtual assistants should ask for clarification. But imagine that every time you ask for the time, the weather, or a flight to London, you always mean London, UK; imagine how annoying the constant clarification would get. It makes me wonder whether VAs are more productive in such cases compared to typing.

"The time in London, England, is 8:23pm."

"No, I meant the time in London, Ontario"

"Sure. The time in London, Ontario, is 3:23pm."

You start by doing a best guess, and actually listening for a correction. For other kinds of requests, you reply with a best guess and ask for confirmation or for clarifying questions.

Hard, yes. Not impossible.
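A minimal sketch of that guess-then-listen flow (the candidate table, its ordering, and the timezone names are all made up for illustration, not anything Siri actually does):

```python
# Hypothetical disambiguation flow: answer with the best guess first,
# then accept a correction that narrows the candidates.

CANDIDATES = {
    ("London", "England"): "Europe/London",
    ("London", "Ontario"): "America/Toronto",
    ("London", "Ohio"): "America/New_York",
}

def best_guess(name, qualifier=None):
    """Return the first candidate matching the name (and qualifier, if any)."""
    for (city, region), tz in CANDIDATES.items():
        if city == name and (qualifier is None or region == qualifier):
            return (city, region, tz)
    return None

# First attempt: no qualifier, so the top-ranked (most famous) match wins.
guess = best_guess("London")
# User corrects: "No, London Ontario" -- re-run with the qualifier.
corrected = best_guess("London", qualifier="Ontario")
```

The key point is the second call: the correction carries the missing qualifier, so no up-front clarifying question is needed on the happy path.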

Right. A good test for a conversational AI is whether it could perform the Abbott & Costello “Who’s on First” sketch with you. It should make assumptions where the characters in the sketch make assumptions, and ask questions borne from confusion where the characters do.

"The average time in London is 5.53 pm"

I mentioned in another comment that with Siri, this exchange does work:

“What time is it in London?”

... gives answer for London Ontario

“No, London England”

... gives answer for England

However there’s no memory to this; the same thing happens next time you ask.

The problem can be reduced to finding the subset of people for which the question is not ambiguous (or the analogous).

The solution for it is, sadly, more data. So I imagine Google or Apple could do it, if they can listen, "see", and access your every electronic communication. They could eventually build a model that "knows" everything about you. I am pretty sure we have the technology to do it. But the privacy implications of this are terrifying.

> The solution for it is, sadly, more data.

I don't see why that would be necessary, since a human does not need to know everything about you or have access to everything you've ever communicated to anyone to guess accurately that when you ask about "London" you probably mean London, England.

>to guess accurately that when you ask about "London" you probably mean London, England.

It depends on your circles/bubble and the context.

Sure, for me and almost certainly the majority of people the majority of the time, assuming London = London, England is almost certainly the correct disambiguation. However, maybe not someone in Ontario or Ohio asking a question about "London." And I expect that the person who sees London, England as this far away place they certainly don't have regular questions about would find that always being the default annoying.

> It depends on your circles/bubble and the context.

But not on the intimate personal details and communications of the individual person, which was what the post I was responding to was about. Sure, there are going to be circumstances where London, England is not the most likely guess, but you don't need to have access to someone's entire personal history to know what those circumstances are, since a human can spot those circumstances without having that knowledge.

> The solution for it is, sadly, more data.

The problem is these "AI"s are plain stupid. The solution for it is moving on from gimmicky and hacky solutions to true AI.

> This example illustrates how difficult AGI is and how far we are from it.

Does it? AGI is very difficult, but I think this example only illustrates that Siri is kinda bad, given that DDG, Google, Alexa, and Bing all got it right.

TBH I always feel amazed at how worked up people get about stuff like this, especially people familiar with software, who should know that there are millions, maybe billions of edge cases like this in a generic knowledge system, and thus how easy it is to make a mistake like this. I mean, the time it took him to write his blog post is probably more than all the time it would take him to follow up with "What time is it in London, England?" It reminds me of someone who commented that "there must not be any black people who work at Apple" because it pronounced "Malcolm X Blvd" as Malcolm 10 Blvd.

I mean, if anything, just appreciate how amazing humans are at differentiating these ambiguities.

I agree with you, but then consider all the influential people who think the exact same category of technology--AI aka machine learning--will produce a self-driving car that is safer than human drivers, or take away every human job in X years.

I think it's very worthwhile to point out these seemingly basic errors as a way to maintain appropriate skepticism about the limits of our technology.

But the post points out that Siri from other Apple devices gets it right. Apple’s “generic knowledge system” can answer this. It’s only Apple Watch which has trouble.

That’s kind of weird. It’s not that Siri is especially bad. It’s that “Siri” is something different depending on how you query it. Other online search systems aren’t like that, and integration and consistency are typically Apple’s forte.

Seems like an easy plausible explanation is mobile devices (like a watch or phone) take your current location into account, while a stationary device (like a HomePod) does not.

Look, I'm not arguing it's not a bug, but I'm just really surprised at how software people, who I think should know better, are surprised that such bugs exist, or more importantly that completely eliminating all types of this class of bugs is basically impossible with current technology.

Answering (with a human voice!) the correct time zone for the wrong "London" is about the mildest possible bug you'll ever see. I might not even call it a "bug". Let's call it a mild inconvenience of modern life!

The aspect that's ruffling feathers, I believe, is that it's one of those cases where someone might have reasonably assumed something was built one way 'under the hood', and was confronted with an effect which forced them to see that it was not implemented that way at all. The issue isn't the 'bug'. It's the realization that their mental model was wrong.

Specifically, something has a name ("Siri") which might lead one to believe that everything from that manufacturer using that name refers to the same thing. (Isn't that the point of a name?) Clearly, it's not.

Your hypothesis sounds plausible, so I tested it. I have a Mac laptop, which has the same 'Location Services' that iOS has (AFAIK). I asked Mac Siri what time it is in London, and got a response for the one in England (further away from me). So that doesn't explain it, or at least not all of it.

Siri is often self-inconsistent. Ask a query again on the same device and you might get a completely different answer.

Exactly. I'll speak into my AirPods "Hey Siri, 30 minute outdoor walk" to start a 30 minute workout on my Apple Watch.

Half the time, Siri replies with "I'm sorry, you don't have an app for that. You can try searching for one on the app store."

Repeat the same exact query to the AirPods seconds later, and bam, it starts the workout on the Apple Watch.

Just because it's hard to implement right doesn't mean Apple should get a pass. I mean, they aren't forced by regulation to create a voice assistant. If they can't make a good one (and this applies to Amazon and Microsoft too), they should have just left it in the lab until they can.

> they should have just left it in the lab until they can.

Totally disagree, because the only way these assistants get better is with real-world usage (which is why I can definitely agree that one should wonder why Apple isn't improving as fast as the competition).

It was only a couple years ago that using Google Assistant was an extremely frustrating experience. I'd say it got about 5-10% of my words wrong, which meant it got my intent wrong about 25-35% of the time. These days I find its accuracy uncanny - it almost never makes a mistake with most of my "standard" queries. No way it could have gotten that good without real-world feedback and data.

It is still an extremely frustrating experience. All of them: Google, Amazon, Apple. That it gets it wrong, ever, is too much for me. The first time I ever had to repeat myself getting directions on the road I was done.

I don't understand why anyone other than hobbyists can stand to use these things. They are so obviously years away from ready for serious use, and the novelty value wore off years ago.

Not shitposting here, these are serious comments about absurdly bad UX.

> That it gets it wrong, ever, is too much for me. The first time I ever had to repeat myself getting directions on the road I was done.

How can you stand to deal with humans?

Humans are so many orders of magnitude better at comprehension that your comment feels a bit disingenuous.

So they should go completely counter to the way that businesses traditionally function and keep something in a lab for potentially decades instead of shipping a "good enough" solution that can be fixed along the way? Especially one that would get valuable information and edge cases from a live system? It's not like it's safety critical software. It's neat to have, sure, but you won't die if you get the wrong London's time.

> shipping a "good enough" solution

I think the debate here is about whether anything has shipped that is actually "good enough". I don't think it's that controversial to avoid shipping stuff that's not good enough.

I think it's good enough. Sure, in this very cherry-picked example it fails. But for a lion's share of functionality it works just fine.

No way

This is the difference between a usable product and something that is not

If my voice assistant is going to make a significant % of errors, it either needs to be very cheap for me to correct it (it's not -- usually you retry and if it keeps failing what do you do?) or I'm going to stop using it

Steve Jobs made great, not passable, products. It's a shame that Siri is so far behind

Siri may be a marvel of modern technology, but if the competition does better, then it's reasonable to complain about Siri.

But if you are Apple, building the Siri product, you'd think this would be one of the first use cases you would develop and/or test. I remember asking Siri to convert a currency for me and it just showed me search results for 'currency conversion'. Again... wouldn't this be a query you would test for?

That's actually a pretty typical blog post for this guy. So I'd say it's more like "some people", not just "people".

Ranking and NLP aren't easy. If you are asking a slightly different question (for example, "What is the weather in London"), and you live somewhere whose nearest major town is called London but is not London in England, you would expect it to give you the weather in "your" London. However, if you are asking for the time in a particular city, then the ranking should of course consider whether the timezone of that city is different from your own - it makes no sense to ask for the time in a city that lies in your own timezone. Then again, if the distance from your location to that other London is greater than a certain threshold, the question could imply that you actually do not know whether the city lies in the same timezone as your location.

All these thresholds or ranking factors seem to come intuitively to humans (I would guess a good intuition for them is actually a sign of intelligence), but it seems to be incredibly hard to capture them in ranking.
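The timezone heuristic above could be sketched roughly like this (the penalty weight, population figures, and candidate list are invented for illustration; a real ranker would blend many more signals):

```python
# Sketch of one heuristic: when asked "what time is it in X", demote
# candidates in the asker's own timezone, since asking for the time in
# your own zone is usually pointless. All numbers here are illustrative.

def rank_for_time_query(candidates, user_tz):
    """candidates: list of (name, timezone, population) tuples."""
    def score(c):
        name, tz, pop = c
        s = float(pop)  # bigger cities are likelier referents
        if tz == user_tz:
            s *= 0.01   # same timezone as the asker: unlikely time query
        return s
    return sorted(candidates, key=score, reverse=True)

candidates = [
    ("London, England", "Europe/London", 9_000_000),
    ("London, Ontario", "America/Toronto", 400_000),
]
# For an asker in Ontario, the same-timezone penalty keeps the English
# London on top even though Ontario's London is nearby.
ranked = rank_for_time_query(candidates, user_tz="America/Toronto")
```

Notice the heuristic flips for an asker in the UK: there, London, Ontario becomes the more plausible target of a time question.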

As others have pointed out, a solution here would be to make Siri more conversational. A simple "Which London?" could've removed the ambiguity and given Siri the opportunity to learn something about that particular person (that London, England is more important to him than London in Canada).

> A simple "Which London?" could've removed the ambiguity and given Siri the opportunity to learn something about that particular person (that London, England is more important to him than London in Canada).

IMO I would be very disappointed if Siri started asking clarifying questions at a significantly higher rate. Siri is already a bit too chatty, and I never feel like having an extended conversation with her.

I’d rather she just say the wrong thing (but make it clear that the answer is for a specific London, e.g. “The time in London Ontario is...”) and I can correct her. It’s the same number of conversational “turns”, but in the happy path when she actually gets it right the first time, it’s one-shot and done.

It’s a lot harder to get signal on this for learning, but I feel like there are ways around this as well. (Maybe saying “thanks” can signal she got something right, and prefixing the next utterance with “no” could signal it was wrong...)

Alexa responds to "wrong", and many variations of that, with cancelling the previous action and thanking you for the feedback.

> Siri is already a bit too chatty, and I never feel like having an extended conversation with her.

Tell me about it. I have to unpair my bluetooth headphones every day, for stupid reasons that aren't Siri's fault. But when I say "Siri, open bluetooth preferences", it parses my command on screen VERY quickly, and then slowly enunciates "Okay! Let's take a look at your bluetooth settings." I'm just tapping my foot and waiting for her to quit talking.

Of course, then, 1 out of 3 times it takes me to the wrong settings page. Because if Settings has been opened recently, it can't deep link from Siri. /shrug

As others have pointed out, a solution here would be to make Siri more conversational.

But that would make it almost as smart as an Infocom game from 1981. Something, something, doesn't scale, mumble, something...

The problem sounds a little like collaborative filtering. If you have a certain affinity with cities A,B,C, then you can compute the expected affinity with a city X by looking at the affinities other people have with X, and their affinities with A,B,C.

Instead of looking at people, you can also scrape websites to get the relations. But here you may get a recursive problem because if a website speaks of "London", you might not know in advance which London they speak of.
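A toy version of that collaborative-filtering idea, with invented users and affinities (real systems would use learned embeddings rather than raw set overlap):

```python
# Estimate a user's affinity for a city from users with overlapping
# affinities. All users and city sets here are made up.

user_affinities = {
    "u1": {"London, England", "Paris, France", "Berlin"},
    "u2": {"London, England", "Paris, France"},
    "u3": {"London, Ontario", "Toronto"},
}

def expected_affinity(my_cities, city):
    """Overlap-weighted fraction of similar users who also like `city`."""
    weight = total = 0.0
    for cities in user_affinities.values():
        overlap = len(my_cities & cities)
        if overlap:
            total += overlap
            if city in cities:
                weight += overlap
    return weight / total if total else 0.0

# A user who cares about Paris, France looks much more like u1/u2 than
# u3, so "London" for them probably means the English one.
score_en = expected_affinity({"Paris, France"}, "London, England")
score_on = expected_affinity({"Paris, France"}, "London, Ontario")
```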

I have now spent most of my adult years learning how to construct a phrase that gets the right results from Google, and later Assistant. It has got to the point where I'm certain it must be a headache for whatever team is trying to support natural language processing in these - all proficient users ask in some artificial gibberish and get where they want to be.

Here comes my favourite brain freeze moment - recently my parents asked me to explain this to them. How do you construct a good search phrase? My brain blanked. I HAVE NO IDEA. It seems I have learned fluent Goonglish without noticing, and now can't explain the grammar or vocabulary of it.

My 2yo daughter reeeeally loves the Aladdin soundtrack, but the French version (we're in Québec). It took us way too many tries to get our Google Home to play the correct version with voice commands, even adding the language ("Ok Google, écouter Aladdin en français") wouldn't work. I now have a small note with the correct incantation posted besides the Google Home, because even forgetting a single word will play the English version instead. (For the curious, the correct incantation is "OK Google, écouter Aladdin bande originale française du film").

I would summarise it as: use separate keywords instead of sentences. "Change Light Bulb" instead of "how to change a light bulb". "Black Science Guy", "Kevin Durant height", "rails has_many api", etc...

Recently Google got much better in understanding full sentences and there are tons of SEO optimized pages for certain phrases. Nevertheless, using keywords is what I imagine advanced users do.

It also got much worse at keyword searches. It seems like the one capability came at the cost of the other.

That's definitely what people were referring to when they talked about good google-fu. It was always odd to me when people would have trouble finding stuff and come to me for help. I somehow picked up that language to search effectively while growing up during search engines' infancy.

It's weird that it feels like my tried-and-true abilities are getting worse. Or Google's algorithm is hurting some of us who became very proficient in very specific ways.

I wonder if anyone is working on special languages for talking to voice interfaces. Maybe a reduced grammar would allow for better recognition accuracy and reliability. And we could get more helpful corrections.

The problem reminds me of the difficulty of programming in applescript. In applescript, articles like "the" can be inserted optionally in the code, and there are lots of equivalent ways to write things, i.e. "if x equals y" is the same as "if x is equal to y". As a result I never remember the syntax, and error messages are less helpful.

From my limited understanding, even in lojban, a constructed language with unambiguous grammar, you can have semantic ambiguity.


The fact that there isn't a feedback mechanism to let the Siri team know that it responded incorrectly tells me everything I need to know.

Until they have real metrics around how often Siri fails they will continue to think that their correct response rate is great.

Companies usually hire editors to manually annotate the quality of the algorithms on anonymized samples. They don't need direct user feedback to gather quality metrics. You also can't tell which step failed in the pipeline: speech recognition, query understanding, or the actual search.

Why can't you know which step? It's their system..they can build as much tracing as they want into it.

Why would editors make user feedback any less valuable? It's hubris to think it's not.

I'm just waiting for my toddler to start repeating "Hey Siri, nobody was talking to you."

My preschooler does that whenever I’m working with a tool that requires me to bend my wrist backwards and the watch-Siri activates.

“Sorry, could you say that again?”

Nobody asked for you to interrupt my chopping, Siri.

They likely don't need the manual feedback as they would be able to tell that someone asked the same question twice in a row, and then look at the data to see what the issue was.

The craziest and most confusing behavior of Siri for me is:

Sometimes you can ask a question and watch it be perfectly transcribed in real time, but then receive a nonsensical answer from it. Ask the exact same question immediately after on the same device, transcribed exactly the same way, and get the correct answer.

Where does such unpredictability come from? How can Siri transcribe the words correctly but fail to deliver the right answer?

> Where does such unpredictability come from? How can Siri transcribe the words correctly but fail to deliver the right answer?

Voice assistants generally use both the text transcription and a bunch of contextual metadata as input. That metadata could include things like what's currently visible on the screen, your location, your recent queries, etc.

So even though the underlying algorithms powering the assistant may be deterministic, the input data between two seemingly identical queries could vary quite a bit.

For instance, Siri almost certainly has context around the previous questions you've asked. It would be reasonable to assume that if an assistant received two identical questions back-to-back the initial answer was wrong.

In that scenario, the assistant might decide to use a different answer (perhaps one that had a lower ranking) in an attempt to get it right.
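As a rough illustration of that idea (purely hypothetical logic, not how Siri actually works), the context could carry the previous query and demote a repeated answer:

```python
# Why identical transcripts can yield different answers: the input is
# the transcript PLUS contextual metadata, and an immediate repeat of
# the same question suggests the first answer was wrong.

def answer(transcript, context, ranked_answers):
    """ranked_answers: best-first list of candidate answers."""
    if context.get("last_query") == transcript and context.get("last_answer"):
        # Same question twice in a row: skip the previous answer and
        # fall back to the next-ranked candidate.
        remaining = [a for a in ranked_answers if a != context["last_answer"]]
        choice = remaining[0] if remaining else ranked_answers[0]
    else:
        choice = ranked_answers[0]
    context["last_query"], context["last_answer"] = transcript, choice
    return choice

ctx = {}
first = answer("what time is it in london", ctx,
               ["London, Ontario", "London, England"])
second = answer("what time is it in london", ctx,
                ["London, Ontario", "London, England"])
```

Even with deterministic code, the two calls differ because the second one sees different context, which matches the "same transcript, different answer" behavior people observe.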

I looked up the last screenshot I had of this. What I asked was "In 2 days remind me to call FRIEND_NAME", and Siri created a reminder that just said "call". Transcribed perfectly, wrong content in the reminder.

I tried it again right after, and the reminder said "call FRIEND_NAME".

I don't think there was any previous conversational context or anything like that. Hard to fathom how that could happen.

Maybe some overly aggressive A/B testing?

Interesting. I can see how the basic algorithm might go "I've been asked for the time (or temp or whatever) in cityname. Citynames are routinely reused globally; what is the closest such cityname?"

But that fails completely when you get to names like London (or Paris or Moscow or Cairo).

But it happens with people, too. I'm from Mississippi, though I haven't lived there since I left for college. I now live in Houston. At a family reunion many years ago, I ran into a cousin I hadn't seen since we were kids. She asked where I was living, and I told her.

"Oh, isn't it terrible about that wreck?" she asked.

Baffled, I asked for more information. "Oh, you know, that wreck over on 406!"

I did not know. "I'm sorry, Houston's really huge. I don't know what wreck you mean."

"Oh, did you mean you live in Houston, TEXAS? I thought you meant Houston, MISSISSIPPI!"

I was, at the time, about 30. I grew up in that state, and lived there until I went to college. And until that moment, I had never even HEARD of Houston, Mississippi (a metropolis, it turns out, of about 3600 people in the misbegotten northeast corner of the state).

It just feels like what is the closest such city name is pretty obviously an insufficient test.

To approximate what a human would do, one would presumably want to start by ranking places on a range of dimensions:

* nearest

* biggest (or maybe size category: big city, city, town, village)

* how many times user has asked about this place before

* how recently user last asked about this place

* ...

If most/all these rankings put the same place in the top spot, go for it! Otherwise, ask the user for clarification.
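A rough sketch of that consensus rule, with invented data and only three of the dimensions (a real system would weight many more, and weight them unevenly):

```python
# Rank candidates on several dimensions; if they agree on a winner,
# answer with it, otherwise ask the user to clarify.

def resolve(candidates):
    """candidates: dicts with name, distance_km, population, times_asked."""
    winners = {
        min(candidates, key=lambda c: c["distance_km"])["name"],   # nearest
        max(candidates, key=lambda c: c["population"])["name"],    # biggest
        max(candidates, key=lambda c: c["times_asked"])["name"],   # most asked about
    }
    if len(winners) == 1:
        return winners.pop()   # every ranking agrees: just answer
    return None                # rankings disagree: ask "Which London?"

cities = [
    {"name": "London, England", "distance_km": 6000,
     "population": 9_000_000, "times_asked": 12},
    {"name": "London, Ontario", "distance_km": 150,
     "population": 400_000, "times_asked": 0},
]
result = resolve(cities)  # the dimensions disagree here, so result is None
```

This is exactly the Gruber case: nearest says Ontario, biggest and most-asked say England, so a cautious assistant would ask rather than guess.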

I wonder if there are Chinese cities with names close to major world cities. Paris has 2.15m people; its metropolitan area has 12.x million. A medium-sized city in China has more than that. If by coincidence it were also named Paris, getting that as the search result would be annoying.

Oh, I absolutely agree. But I can see how that might have come in as a first-assumption that was never revised. Which, if you think about it, is probably a huge class of bugs industry-wide.

To throw a tongue-in-cheek additional example in here: the population of London, Ontario is (significantly) greater than the population of the City of London, UK.

Reality is hard. And with machine learning (especially proprietary, remotely-hosted machine learning) there's rarely a way to pinpoint a line of code and say: "this is what happened and why you're now frustrated and firing hypothetical personal assistants".

Yes, but in the same way people don't usually mean "London, Ontario" when they say London, they also don't usually mean the City of London (which, for the benefit of people who may not know, is a tiny portion of London with a population less than 10,000).

A group of us once tried to rank cities in Europe by population only to realize that most of them are effectively incomparable.

Cities sometimes have clear legal boundaries that feel irrelevant to the question, like the City of London, but more generally have metro areas that sprawl well into an ambiguously defined countryside. There's rarely a "this block is city, the next block over is clearly not" situation, so the number of people you include ends up being pretty arbitrary.

While it is arbitrary, and not a city (in the sense that it hasn't received that status from the Queen), Greater London is absolutely a defined administrative area in the UK, with a governing body (the London Assembly) and a mayor (the Mayor of London -- not the Lord Mayor of London, who is the mayor of the City of London).

Anything can be ranked as long as you clearly define the ranking metric first.

When ranking by population it often makes most sense to use the population of the metropolitan area. That is, to ignore the administrative divisions, which vary too much, and focus on the physical reality of the urban area.

Just to make things a bit more complicated London has two cities. The City of London, and the City of Westminster, which is also a borough of London.

Two cities in one City; it shouldn't be allowed.

It's time to refactor London.

Already done. We're talking about "the next gen" and it's still dodgy.


Yep; my (indirect) point is that there are multiple possible reasons why Siri may have made the judgement that London, Ontario was more relevant when answering.

My guess is that Apple would find it difficult to provide robust references to John to explain why it happened, or how they've fixed it for him (and whether that fix is a one-off workaround for his complaint, etc..)

People always expect the 'obvious' interpretation, but sometimes it's difficult to define exactly what that means for everyone. As another poster said here, context is also a very important factor.

Precisely. It's the user's context that matters, and that can change even for a particular individual at different times and locations.

Remote, proprietary personal assistants tend to apply their own (generally unknowable and unaccountable, from the user's perspective) interpretation of the context.

Considering (IIRC) asking about the weather is what they show on the TV ads, they should've made sure their city selection logic returns the answer that most people would find acceptable..

Might as well comment: first employee at Siri here. This result maybe should offer ambiguity resolution, but where does it stop? The person who compared it to Google was right on: Siri provides a singular result in most cases vs. multiple search-style results. We did use geo for locality-based results in the past, which would solve the problem the OP mentioned; not sure if they call location for these requests now. The other person who mentioned we can't/couldn't train on the data is correct too. Again, privacy first. Be proud, Apple cares a lot.

When the Siri commercials hit at launch (no one told us there would be commercials), we got decimated, and we couldn't debug the issues because user utterances were not allowed to be logged. Luckily, after much sleep deprivation, one of my engineers (love you Stu) said, "hey, aren't they running commercials?", to all our surprise. We convinced the privacy team to let us log word parts, and then we started to see words that were present in the commercials.

Fun fact: the same thing happened when Tom Cruise was presenting at the Academy Awards. We had millions+ asks all at the same time; again, word parts: "height", "tom", "foot", etc.

There's a Woodland Hills in Utah and a Woodland Hills in California. If you ask Google what the weather is in Woodland Hills, it will ALWAYS give you the weather for California. Even if your current location is Woodland Hills, Utah and even if your address is set to Woodland Hills Utah in your Google account or Google home.

I live in Alton in Hampshire, England, a town of 20,000-ish people. To be fair, Google (Maps etc.) tends to get the correct Alton, but a lot of Mickey Mouse sites often think I am in Alton, Staffordshire, a tiny village next to a more famous theme park, Alton Towers. Which is very annoying when setting filters on job sites, as I am never sure which Alton it thinks I am in; local ads that don't make sense; etc.

Worse there is an Alton, New Hampshire, which confuses even Google sometimes.

Even worse, Apple sometimes seems confused about where I am, as I have twice woken up to see a tornado warning on my iOS lock screen in the morning. I live pretty far from any decent tornadoes. Unfortunately, I have been too sleepy to stop myself from unlocking the screen before remembering to screenshot it.

Why would you need to specify your location? You can just ask "How is the weather?" or "What will be the weather like tomorrow?"

Maybe you're traveling and want to know what the weather is at home.

My usual dialog with siri:

Me: "Hey Siri, play Radiolab podcast"

Siri: "Which Radiolab podcast, Radiolab or Radiolab: More Perfect?"

Me: "Radiolab"

Siri: "Which Radiolab podcast, Radiolab or Radiolab: More Perfect?"

Me: "Radiolab"

Siri: "Which Radiolab podcast, Radiolab or Radiolab: More Perfect?"


Me: "The first one"

Siri: "I don't know >the first one<"

Me: "Siri you're useless"

Siri: "That's not nice"

Me: "Could be but it's true"

"Hey Siri, set a timer for 5 minutes"

90% of the time works fine, and it's essentially all I use Siri for. It's very convenient when cooking and my hands are dirty. But 10% of the time I get something along the lines of...

"I'm sorry, but you don't have the Timer app installed".

"I'm sorry, but you don't have the Timer app installed".

"I'm sorry, but you don't have the Timer app installed".

"I'm sorry, but you don't have the Timer app installed".

It's infuriating because I know Siri is dumb so I use the same exact simple phrases to avoid confusion. Sometimes it works, sometimes it doesn't. It always transcribes the command accurately though! I've actually lost my temper and smashed an Apple Watch before over this. This is in my house, on a very reliable network, always with my phone within a reasonable distance.

It's ridiculous how Siri is still this shitty. I have an 11 Pro and even on such an expensive phone I can't really trust it to do anything more advanced than set timers. Every few months I try to do something else and just get annoyed at how bad it is.

Before lockdown I even had it disabled entirely because it would get activated randomly from time to time, even if nobody in the vicinity said anything remotely close to "Hey Siri".

The problem is not just that it is wrong, nor that it doesn't have enough personal information, but that it lacks proper personalisation and the ability to learn.

You can't reply with "no Siri, not that London" and have it remember. It doesn't learn your voice among the people who normally use your Siri in your household.

"Artificial intelligence" is always going to make mistakes, as do real humans. Humans can perform unsupervised learning - in fact it's one of the key skills that employers like to select on! Until AI can learn in context it's going to be very limited.

Siri indeed never learns.

I've had to disable "Hey Siri" because my daughter's name is pronounced vaguely like Siri. The worst thing is, Siri transcribes what it hears, and it transcribes my daughter's name. So it doesn't hear wrong; it just activates on a different name than Siri.

I've tried telling Siri to shut up, but it never learns not to activate when I call out my daughter's name.

That might be because the activation words are recognized by a separate chip (so it's low-power and works offline). Whereas the rest of the conversation is with the software service.

At least that's what I heard about how iirc Alexa works.

Activation words are fuzzy by design.

Siri is easy enough so we never looked much into it, but “OK Google” for instance looked like a real PITA, so we did some research before buying an assistant.

It appears a ton of people just intentionally say “Ok GooGoo”, “Ok Boogle” etc., whatever is easier for them to pronounce and it works perfectly fine.

When those designs strip your privacy, "fuzzy by design" is very much a bug to the user, and only a feature to the company mining the data.

I may be biased as I helped a voice recognition internal project in a previous life.

It’s a genuinely hard problem to solve, and I am willing to give the benefit of the doubt to Apple for instance when they have humans reviewing samples. There may be other motivations, there’s ton of people in any of these companies, any given feature must be seen from a different angle depending on the department looking at it.

But I think a lot of what we see as privacy violating is primarily an effect of the flaws and all the hacks needed to make the feature work at all (when it works).

It activated when I greeted my cat. My cat is called "Timmie".

Siri is not artificial intelligence; it is speech-to-text plus a poor search experience with one result.

I had never used an Apple product before the company I joined recently gave me a MacBook Pro. I am really surprised how bad the product quality is. Calendar notifications are very random: sometimes they fire, sometimes they don't. I have missed a couple of meetings because the notification popped up after the meeting was over. Keyboard shortcuts are similarly random; sometimes they open the app, sometimes they don't. The laptop also gets very hot if you are not sitting in A/C. Not sure if it is this specific laptop or a general issue.

Yeah, it is the norm these days. I've been using Macs for 15 years and they have been fantastic until the very latest iterations (after 2016–17).

My latest MacBook (16") is so unstable that it is actually funny at this point.

So do people just buy these to look cool? I tried using a Macbook many times, but often got frustrated and went back to my good old Linux laptop for development. Doesn't look quite as slick, but certainly gets the job done.

No, I don't think so. There are multiple aspects.

I develop on this thing. It is running a great Unix os. I can't stand desktop Linux. The hardware quality was the best with a wide margin before the latest gen. Battery life is also great. I like them for development work when they are stable.

A lot of people are also really invested into the ecosystem. My entire photo collection is on iCloud. I use an iPhone. I can copy paste between my computer and phone. My Apple watch unlocks the computer when I'm near... List goes on.

But now I feel like Apple is a fantastic phone company that also happens to make some computers. They have been degrading pretty bad.

It's also that Windows/Linux have many of these issues as well. It's not as if Windows 10 notifications are clear and intuitive. When I go to my desktop after a day of work, Windows will slowly replay every single Slack message and email I got all day, one at a time, for almost an hour, as single notifications.

I think it's less that OS X is bad now, but more that it's finally degraded to a level of annoyance that people just have gotten enured to with Windows. It's not to say that that's a good thing, but at this point, I have known bugs and annoyances with all of the computers I work with, no matter the platform.

Some of it is also that Apple has a "real" integrated ecosystem. To what you say, you can easily move things between iOS and OS X. If you're watching stuff on your Mac, you can throw it to an Apple TV or your Airpods. Windows doesn't have a version of that that "just works". The closest you get is opting into Google's ecosystem and going Chromecast/Android, but I'd rather not trust Google with even more of my info.

Honestly I'd dare to say most of Apple's market right now is purely from vendor lock-in. Both their hardware and software are getting worse, but not bad enough for people to switch their entire digital lives to a different ecosystem.. not yet, anyway.

My first Mac was an employer-provided MBP in... oh, 2011 or so. Before that I'd used DOS, Windows (3.1 and up, including NT4 and 2K) and Linux (Mandrake, Debian, Gentoo, Ubuntu, roughly in that order with a little Redhat and Fedora here and there). I'd seen some early OSX server edition thing, but not really used it, and I'd used pre-OSX Macs at school (hated them, "it's more stable and faster than Winblowz" my ass). Some exposure to Solaris, too. Used BeOS (loved it) and QNX on home machines for various purposes, as well.

The MBP was the first laptop I'd used that 1) had a trackpad good enough that I didn't feel like I needed to carry a mouse around to use it for more than 10min at a time, and 2) had battery life good enough that I didn't feel like I needed to take my power supply with me if I'd be away from my desk for more than an hour. It had every port I was likely to need for most tasks. In short, it was the first time I'd used a laptop that was actually usefully portable as a self-contained device. They kinda ruined that appeal by going all-USB-C and The Great Endongling, but that's another story.

It was also very stable, and over time I came to really appreciate the built-in software. Preview's amazing (seriously, never would have thought a preview app would make a whole computing platform "sticky" for me, but here we are, it's that good), Safari's the only browser that seems to really care about power use, terminal's light and has very low input latency, it comes with a decent video editor, an office suite I prefer over anything I've used on Linux, and so on. In short it's full of good-enough productivity software that's also light enough on resources that I don't hesitate to open them, and often forget they're still open in the background.

These days I like having a base OS that's decent, includes the GUI and basic productivity tools, and that's distinctly separate from my user-managed packages (homebrew) rather than having them all mixed up together (yes, I could achieve this on Linux, if it had a core, consolidated GUI/windowing system so various apps weren't targeting totally different windowing toolkits, but it doesn't, so separating a capable and complete GUI "base OS" from the rest of one's packages gets tricky). There are quite a few little nice-to-haves littered around the settings and UI. Most of the software is generally better polished UX wise than Linux or Windows, and that doesn't just mean it's pretty—it performs well and, most importantly, consistently. There are problems and exceptions to "well and consistently" but there are so many more issues on competing platforms that even if it's gotten worse, it's still much nicer to use.

Given the premium on hardware (that's come and gone—at times there almost wasn't one if you actually compared apples to apples [haha], but right now it's large) I'd rather use Linux (or, well, probably a BSD, but that'd mean even more fiddling most likely) but the only times that's seemed to function genuinely well and stably compared to its competition was when I either kept tight control over every aspect of the system (time-consuming, meant figuring out how to do any new thing I needed to do that other systems might do automatically, which wasn't always a great use of time to put it mildly) or in the early days of Ubuntu (talking pre-Pulse Audio, so quite a while ago) which was really sensible, light, and polished for a time.

I do still run Windows only for gaming, and Linux on servers or in GUI-equipped VMs for certain purposes.

It's not just Apple, I've got a Mi phone, sometimes reminders pop up hours after they happened. They've mucked around with the default android lockscreen to save power and I think this is causing the problem.

The devices are so complicated now that they can't do their most basic functions right.

> The calendar notification is very random. Sometimes it fires, sometimes it does not. I have missed couple of meetings because notification popped after the meeting was over.

I see something similar and assumed this happens because Mail / Calendar are relying on ics attachments (not sure what the behaviour is with the Gmail integration). I believe this means that if Mail is closed you don’t get Calendar updates until you open both and refresh.

Either way I find I have to refresh Mail and Calendar a lot to keep them in sync.

Heating is either a hardware issue or something like a broken program running continuously - on a normal setup that doesn’t happen.

Calendar / Todos depends on the backend. If you’re using Exchange, check the settings to confirm that it’s not set to poll every hour or something like that.

I had this problem too. Maybe try charging it on the other side. It sounds crazy but it's true:


It's ridiculous how poor in functionality all of them are... The best they can do is, what? Creating schedule entries, for me. All I ask Google is the weather, time, some search when I'm lazy and translate (the voice translate app itself is great btw). Feeling like a total corporate bitch saying "Hey Google" every time, too :D

This is supposed to be a personal assistant. And I have a whole list of what it could do for me, personally. But it doesn't.

I've been trying to figure out how to hook Google's speech recognition and voice into other apps, since they're great and it's 99% of what I need, hands-free control and feedback. Maybe they should make that easy, preferably offline and let other people create their own personal assistant modules or something.

> Maybe they should make that easy, preferably offline and let other people create their own personal assistant modules or something.

Like they would ever do that. Then you would no longer be their "corporate bitch".

Seriously, the one thing that stands between home assistants and being useful is opening the software up and letting it be used by regular OSS devs. Alas, every one of the four big providers (Apple, Google, MS and Amazon) treat them as their moat; they want control over the ecosystem. It's the same in many other places in the industry - we're technologically way behind where we could be, because everyone wants to be the platform and commoditize everyone else, which necessitates having total control.

I'm fairly sure they could still control and monetize such a product. Kind of like Android.

Maybe there's already something like that in the works, with all the talk and investment in AI, we should be seeing some real world results...

I agree, some things are just astonishingly bad given the immense effort and resources that are being put into machine learning.

Microsoft OneDrive tags my photos. It's mostly useless. For instance, I have some pictures of squirrels on the tree outside my window. Squirrels can really do the most amazing things on trees, but they are small compared to the tree.

Microsoft with all its AI muscle will invariably tag those images as something like #Outdoor #Grass #Plant #Tree.

It's the same problem with all of those benchmark beating AIs. They have no clue what's special about the picture and what just happens to be in them as well.

I’ve had Siri disabled for years. Even the basic “call home” works every third time. I try it for a few minutes with every new iOS update, only to see it’s still the same dumpster fire.

When I lived in the United States I frequently used Siri while driving to set up the GPS or change what music I was listening to. But indeed, ever since moving to a country with a nice public transportation system, I’ve had it disabled. There just isn’t a compelling use case.

Meanwhile, my iPhone has gotten much smarter about adding meetings to my calendar, or guessing the person who is calling me. These seem like the real use cases for AI going forward.

TBH, Google Assistant is not that much better. In the last few months it has become absurdly racist against my Italian accent, replying to me in Italian after I ask stuff in English - and getting the question wrong anyway.

But yes, Siri is the worst.

Are you sure it's due to your accent? That seems like a difficult feature to even implement. I know Google loves to play language shenanigans based on your current IP address.

Mine will sometimes parse English, sometimes German. Sometimes it will understand English, and answer in German. It's a total crapshoot.

I live in England.

Google Assistant failed on me recently trying to set a timer. It correctly understood my "set timer for three-and-a-half minutes" request, says it's setting a timer for 3am instead and proceeds to actually set a timer for 30 seconds. How is there that much disconnect between the stages of the query?!

Screenshot: https://twitter.com/R1CH_TL/status/1252232170237640706/photo...

You consider that racist?

There’s nothing worse than being forced to use a language you didn’t explicitly request.

Well, maybe some things are worse.

Probably at least number 3 behind the Holocaust and Internet Explorer.

Yesterday I was washing the car while listening to music with my AirPods. I mistakenly clicked them and launched Siri, which for some reason called a number on my phone through FaceTime.

I always immediately disable Siri because something similar happened the very first week I got my first iPhone. It seems to want to call the one person you haven't talked to in years.

Ha. Same thing for me. Made me rush to the phone as I heard the number it wanted to call.

It isn’t really ridiculous. The number of people that buy an iphone because of Siri could share a pizza.

It may not be the main selling point but it's certainly a factor for many people. And the fact that Siri is so bad makes the whole phone feel less high-quality.

And maybe I'm the exception here, but I have refrained from buying an Apple Homepod specifically because of how bad Siri is. If it was on the same level as Google Assistant I would have bought one by now.

Back when Siri was new and I was much younger this was something I really wanted…

I'm convinced it's actually gotten worse with the 11, I recently went from an iPhone 7 where Siri only really activated randomly a handful of times a year and could set timers (Only thing I ever use it for) with a shout across a room over the sound of an oven and fan.

Now it randomly activates multiple times a week and really struggles to even pick up me talking to it right next to it.

Convinced they've switched to a less capable microphone system because assistants were all the rage in the 7 era but now I think people have realized it's not really that important.

It even sucks at setting timers. I asked to set a timer for 50 minutes and it clearly said 50 minutes on the screen and then “corrected” it to 15 minutes.

For a while, it randomly decided that “call my wife” meant to “call my mom.” It clearly said call my wife on the screen and then switched to “mom”.

You have to know the Siri hacks.

Set a timer for fifty-one minutes or forty-nine minutes.

Even Siri can hear that.

Ask it to set a timer for 5/6ths of an hour.

Like others have said - it's Apple's privacy policy, they don't know enough about you and can't use the recordings to train.

I think Apple should have been more honest about it in their privacy messaging -

"Hey guys, pretty please can we listen to your Siri recordings? We know it's not the privacy style you're used to from Apple, but if you want Siri to ever not be a piece of crap, this is really the only way."

I don't buy that anymore. There's a multitude of ways to improve Siri while respecting privacy. Apple doesn't have to go the Google way to improve.

Apple knows a ton about me, they have realtime access to my email, calendar, contacts etc. If they have guarded access, then I would accept a toggle. A lot of Siri processing happens locally on the device nowadays, which could be why the Watch, Mac, iPhone and Homepod all can give wildly different results.

Also, Apple could train it themselves, they possibly do, except we haven't gotten an update yet. A large portion of my personal training data could be stored in iCloud, I mean my passwords, mail, documents and my photos are there, right? The analysis of my voice data is sent to Apple anyway.

I think that until real AI appears that can mimic human personality and expand on it, and understand a variety of languages, virtual assistants will remain what they are: virtual assistants capable of executing basic commands provided by a human (preferably in English). This technology isn't bad per se, and in particular cases it's helpful (e.g. for blind people), but it's still far from what we dream about.

My Nokia 1320 with Windows Phone 8.1 came with a very basic VA that was capable of understanding Polish, but only if I dropped all grammar and talked like a robot. "Call mum" is "zadzwoń do mamy" in its proper form, but I had to say "zadzwoń do mama", which sounds unnatural; not to mention that responses are also read without proper Polish grammar: "calling mum" is synthesized as "dzwonię do mama", not "dzwonię do mamy". Grammatical complexity is a problem for VA technology, and that's probably why neither Siri nor Cortana supports Polish or other Slavic languages, not to mention dozens of other languages.

I mostly use Siri to play Spotify, but most of the time she won't listen to me through my car's microphone. If she isn't working then I need to pull my phone out of my pocket and place it somewhere the phone can clearly hear me.

Every once in a while she will decide to listen through my car, but it's very rare. I don't have Apple Car Play so not sure if that's a factor.

Google has been far better and more consistent at listening through my car. Even when it didn't get my query right, at least I could correct it without pulling my phone out.

I almost never use the "Google Assistant". However, it has a habit of firing up while I'm driving and listening to podcasts: "I didn't understand your command" (generally at an annoyingly loud volume, at that). I didn't ask you anything, you stupid thing. You just misheard a sound coming out of the phone itself, where nobody said anything remotely related to "hey google" or "OK google". Puhlease.

I turn off as many settings for Assistant as I can, and disallow the Microphone permission to the "Google" app. That should help you out.

Siri is utterly useless. Like, I just don't. The only capacity in which I use it is to tell it to open an app on the phone, or set the timer.

Siri is like that nice employee who was hired by way of nepotism. She's attractive, but she sucks at most things; the organization won't fire her because of the aforementioned nepotism, and the only reason you put up with her is 'cause she's attractive and she, at the very least, makes coffee and photocopies just well enough. But you can't trust her with more advanced tasks.

What's worse is that the organization also won't hire her more talented and equally attractive contemporary Google Assistant because of aforementioned nepotism. The boss thinks there's only room for one assistant.


Same deal for me but with Google. I had to turn it off because it kept activating itself, despite voice training it to apparently just my voice. I'd be driving in the car by myself, playing music (via BT, from Google Play, on the same phone!) and it would randomly pause my music and give me an assistant prompt (after, I assume, hearing something in the song that it was playing itself that sounded like OK Google). It was absolutely infuriating and would come up a few times a week.

Still? Siri has actually gotten worse.

In the last year or so it's gone from correctly handling "add red salsa to the shopping list" to consistently adding two items "red" and "salsa". (It also fails on "buttermilk" and others.)

And around the time this started happening, Siri went from acting after a short pause to saying "just a sec" after a short pause.

Perhaps it's time to file a feature request to Apple to allow us to plug in alternative digital assistants in place of Siri.

"Timers"? You mean: the timer. You can still only set one timer. Yet another deficiency. Its like the people who work on iOS don't ever cook multiple things at once.

I get the accidental invocations about once a day, because my dog is named Maisie and apparently that's just close enough to "Hey Siri".

Aza Raskin's Ubiquity was such a clear model of how to build voice interfaces the right way, and it wasn't even a voice interface. It was a bit of a launcher that tied APIs together on the web.

Let users create and share small commands. Create a simple natural language for commands that are easy to program, extend, and remember, and narrow the scope of inputs the voice engine has to deal with.

It was so beautiful and effective and just light years ahead of what we're getting.

Microsoft gets a special mention for lost potential here. Their voice system in Windows could be a way to navigate the layered menus of the OS, but it is mostly focused on answering general queries. Voice is a great replacement for the program launcher, except it's not customizable, but that's about the extent of how much you can control the system with it. Let me do anything buried in the control panel, show me everything you know about a process when I ask, solve that first, then worry later about telling me how big the moon is. You make an OS, don't forget what that is.

Once I had a new assistant and I asked her to book me a flight to Boston. She went to the travel booking system, typed in Boston, then called me back confused. "Which Boston do you want? There are 8 of them?"

I was caught off guard and told her I'd prefer the one in Massachusetts. I did not fire her. She was young, had a poor general education, and had never traveled outside her home state. Those things do not make her stupid.

This is how people treated Siri when it came out. It's been nine years since, so it should really not be something that is OK now.

(Aside: are there actually people who grew up in the United States who aren't aware of the significance of Boston, Massachusetts, arguably the nexus of the American Revolution?)

If she were still doing that a decade later what would your response be?

We choose to hire Siri/Alexa/etc. At some point there has to be a baseline of good enough, or else you fire them.

Yeah, point very much taken. Agreed.

I mean, you can become smarter. She does sound stupid, but fortunately, maybe she can change that.

Note that she didn't book you a flight to the nearest Boston either.

Thanks for this. The tone of this article was off-putting to me. The world doesn't really break down into "us" and "stupid people."

For a related example, type "11:00 EST to UTC" into Google and DuckDuckGo. Google says 15:00 because it interprets "EST" colloquially as "the current time on east coast US". DuckDuckGo says 16:00 because it interprets EST literally as "the time on east coast when it isn't daylight savings time" (compared to ET or EDT). It isn't clear which behavior is more desirable.
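For the curious, both readings are easy to reproduce with fixed-offset zones from Python's stdlib. The spring date below is arbitrary, chosen because the US east coast is on EDT then:

```python
from datetime import datetime, timedelta, timezone

EST = timezone(timedelta(hours=-5), "EST")  # fixed UTC-5, year-round
EDT = timezone(timedelta(hours=-4), "EDT")  # fixed UTC-4

# "11:00 EST" read literally (DuckDuckGo's interpretation):
literal = datetime(2020, 4, 20, 11, 0, tzinfo=EST).astimezone(timezone.utc)
print(literal.strftime("%H:%M"))  # 16:00

# "11:00 on the US east coast right now", i.e. EDT in April
# (apparently Google's interpretation):
colloquial = datetime(2020, 4, 20, 11, 0, tzinfo=EDT).astimezone(timezone.utc)
print(colloquial.strftime("%H:%M"))  # 15:00
```

The one-hour gap between the two outputs is exactly the Google/DDG disagreement.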

I certainly prefer DDG here. There is no ambiguity in "11:00 EST to UTC": EST is always UTC-5, EDT is always UTC-4, and ET could mean either depending on the time of year. Google isn't making an arbitrary decision to deal with ambiguity; it's ignoring the specificity in the query and reinterpreting it (as ET rather than EST).
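The two readings are easy to check with a small Python sketch (3.9+, using the standard-library `zoneinfo`; the July date is just an illustrative choice, since that's when EDT is in effect):

```python
from datetime import datetime, timedelta, timezone
from zoneinfo import ZoneInfo

# A wall-clock time of 11:00 on a July date, when the US east coast observes EDT.
naive = datetime(2020, 7, 1, 11, 0)

# Literal reading (DuckDuckGo): EST is a fixed UTC-5 offset, year-round.
est = timezone(timedelta(hours=-5), "EST")
print(naive.replace(tzinfo=est).astimezone(timezone.utc).time())  # 16:00:00

# Colloquial reading (Google): "current east coast time", i.e.
# America/New_York, which is EDT (UTC-4) in July.
ny = ZoneInfo("America/New_York")
print(naive.replace(tzinfo=ny).astimezone(timezone.utc).time())   # 15:00:00
```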

I'd argue that DuckDuckGo's behaviour is the correct one, but they should display a warning when the east coast is on EDT and someone requests a conversion with EST. Maybe we will stop using daylight saving time before the masses learn the difference between ET/EST/EDT.

Edge case: what would you do about Mountain Time? Check out this "fun" time zone map of Arizona where you could fly a straight line through 6 time settings in one hour: https://en.wikipedia.org/wiki/Time_in_Arizona#/media/File:AZ...

The ET/EST/EDT question is reasonable because ET is heavily populated and behaves sanely (as far as I know). Other popular US time zones are... different.

Interesting edge case. Perhaps we need something other than MST for Arizona (excluding Navajo Nation) if we want a lack of ambiguity. If only politicians knew how annoying dealing with time is for software developers!

The other day I was on a livestream's chat where a Londoner wrote "it's 11pm GMT here in London!", which I noticed was wrong because it was 10pm GMT (and it was 11pm BST).

If he scheduled a call for 15:00 "GMT", he would call in at 14:00 GMT instead and wonder why the international callers weren't showing up.

So the kind of "automatic fixing of ignorance" Google does could be annoying and ruin some stuff.
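That London/GMT/BST mix-up is also checkable in a couple of lines; a sketch in Python (3.9+) for a summer evening:

```python
from datetime import datetime, timezone
from zoneinfo import ZoneInfo

# 11pm local in London on a summer night: the wall-clock zone is BST,
# and the actual GMT/UTC time is an hour earlier.
london = datetime(2020, 7, 1, 23, 0, tzinfo=ZoneInfo("Europe/London"))
print(london.tzname())                       # BST
print(london.astimezone(timezone.utc).time())  # 22:00:00
```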

I don't understand, 11:00am EST is 3pm UTC? Isn't it correct?

An actual assistant would have a lot of context. Do I know that you're going to travel to London, UK for a break next week? Then I would naturally assume that you are interested in the time there.

However, are you planning to visit your parents in London, Canada this weekend? Then an assistant who would still answer with the time of London, UK would maybe also not be the smartest?

So really, context is everything, and making broad statements like "an assistant who answers with anything but London, UK should get fired" is something said by someone who, IMHO, should themselves get fired. shrug

Also, IMHO, if someone doesn't know that machines don't have human context and therefore doesn't know to ask their digital assistant "What is the time in London, UK" when they want to know the time in London, UK, then maybe they should get fired from their tech job. shrug

In the absence of context, the best answer is London, UK.

You actually need to imagine additional context to make any other answer plausible.

If you hear the sound of hooves clip-clopping nearby, you think horses, not zebras.

Seems like there is context though: location.

In Western PA, if I hear someone talk about Indiana or Washington, I'm inclined to think of the counties/boroughs first, because they're closer. If someone says they're going to college at Cal, I'd think of California University of Pennsylvania before the University of California, because that's one of the more popular state schools in the region and a lot of people go there.

London, Ontario is a weird case because it is closer, but not close enough to be the better answer.

In the absence of context there is no best answer. You need context to know why London, UK is much more likely to be the right London "without additional context".

> In the absence of context there is no best answer.

sigh why do people always feel the need to redefine words?

By any definition of the word, "best" is the correct term to use here.

Here are a few definitions of the word "best"; please, in all honesty, tell me why its use is wrong given the following:

• best: "In the most excellent or most suitable manner; with most advantage or success: as, he who runs best gets the prize; the best-behaved boy in the school; the best-cultivated fields."

• best: "In or to the highest degree; to the fullest extent; most fully: as, those who know him best speak highly of him; those best informed say so; the best-abused man in town."

• best: "Of the highest quality, excellence, or standing: said of both persons and things in regard to mental, moral, or physical qualities, whether inherent or acquired: as, the best writers and speakers; the best families; the best judgment; the best years of one's life; a house built of the best materials."

So if there is a finite set of possible answers and a set of criteria that establish a metric to turn this set into an ordered set, there is by definition a non-empty set of best answers. The context-free metric for ranking cities is global relevance: https://en.wikipedia.org/wiki/Global_city

And now please look at the number 1 spot of this list.

Thanks for reading.

Yes, but that set of criteria to establish a metric is context.

No, it isn't - see

> The context-free metric for ranking cities is global relevance: https://en.wikipedia.org/wiki/Global_city

There is no context whatsoever in global relevance.

If you ask someone what time it is, you don't get asked "where?" in return - the current location is implied. Same with cities - if you mention London, it is implied that you mean the most globally relevant London, not the one that was named after it.

The context is being an ordinary human being in the world. I think a significant chunk of people _in_ Ontario would expect the one in England.

Sure, but maybe a reasonable default context would be "a human currently living on Earth is asking the question."

In game-theoretic terms, London, England is the Schelling point [1] of the set of places named London on Earth.

[1] https://en.wikipedia.org/wiki/Focal_point_(game_theory)

But if none of these contexts you mentioned exist (which is the case here), what would an actual assistant answer?

There was minimal context, location data from the watch. Maybe even the fact that there's a pandemic and he ain't gonna go to London, UK, but possibly Canada.

However you spin it, someone will find a reason to find the answer "stupid", so the only one who is really stupid is the person who fails to ask a concrete question if they expect a concrete answer.

If your company is located in Paris, Texas and you are in New York, ready to embark on a business trip to Paris, France, and you ask your assistant over the phone, out of the blue, "what is the temperature in Paris?", they would probably ask "which one?".

Isn't half the point of these digital assistants that they do have the context?

I've got no idea about Siri, but the Android one ties into your Google account to get your calendar and mail, so it can get context about upcoming travel etc.

See my answer below, there was context and still there is no right/wrong answer. If you expect a specific answer, ask a specific question. If you fail to do so, then maybe that is the stupid thing which happened?

ha! I was wondering how hard it'd be to find someone making an appeal to "context". While you're not exactly wrong, you are. Parsing meaning is something humans are surprisingly good at, and trained for.

I think the most damning part is how, at the bottom, he lists a handful of other "smart" assistants which correctly list London UK's time... for now.

But, his point about consistency and slowness is exactly why I never use these shitty voice assistants. If I'm going to be interacting with some pedantic robot, I generally want to be able to edit the text of my request.

TL;DR: voice assistants suck.

> he lists a handful of other "smart" assistants which correctly list London UK's time...

It's only correct if that is what you secretly asked for. If I have never travelled to Europe and I am planning a trip to London, Canada, then in my subjective world I would kind of be disappointed when my digital assistant told me the time in London, UK. That person would have equally zero understanding, like "WTF SIRI, why would I want to know the time in some city somewhere, when I don't even know where it is located on the map, when you know that I often go and visit Canada. Gosh, you stupid idiot assistant!"

A London, Ontarian would probably expect/accept the confusion. Because if he's travelled and met international people and they asked where he's from, he would've specified ", Ontario", otherwise they would've all assumed he hails from where Sherlock Holmes lived.


Same thing happens with Frankfurt. If I order something from Amazon.de, my package usually goes through Frankfurt. When I checked my package tracker app on my iPhone, I was surprised to find out that Frankfurt is actually on the Germany-Poland border.

Turns out that if I enter "Frankfurt, Germany" into Apple Maps (which I assume is what the package tracker app does), it takes me to "Frankfurt (Oder), Germany" instead of "Frankfurt am Main, Germany".

My company (based in Frankfurt am Main, Germany) had an online business travel reservation system which helpfully pre-filled the starting point of all new trips to Frankfurt, Kentucky (which is not even the right spelling of the city).

I filed a request to NOT pre-fill the starting city to a place in Kentucky and got a polite but firm reply that this was the default list provided by the 3rd party online booking engine and cannot be customized. Sigh.

You might need to politely but firmly ask them to escalate this with the 3rd party provider, especially depending on your footing and monthly spends

According to Wikipedia Frankfort, KY doesn't even have an airport so this is a double whammy of stupid on the part of the 3rd party

I had forgotten about this until just now, but it wasn't so long ago that Virgin Atlantic (a London, England-based airline) would autocomplete "London" (vs LHR) to "East London, South Africa" in the origin field, despite them not flying there. That annoyed me around once a week back when I used to fly a lot more.

If you ever cross the border from Frankfurt/Oder, the first thing you see in Poland is a giant Amazon building. Despite that building being there, Polish people still can't order at Amazon.

> Polish people still can't order at Amazon.

That is not true. Amazon.de is even available in Polish, with free shipping for €39+ orders.

We can order from amazon.de, it even offers localized UI and (poorly auto-translated) product descriptions.

But yes, I think it's likely that an order from amazon.de goes through Frankfurt an der Oder, as it probably originated from one of the many Amazon warehouses in Poland.

Polish people (in fact, people all over the world) can order on Amazon.de, they just can't get the same shipping rates that German customers get.

Ah yes, this is funny: a shipment from a warehouse in Poland to a customer in Germany who ordered on amazon.de is considered "local", while a shipment from that same warehouse to a customer in Poland is "international".

Before COVID there were rumors that Amazon might officially enter the Polish market this year, with the state-owned Polish Post as the local shipping partner.

Fun etymology fact: Both Frankfurts get their name from a simple description of the same thing, which they both are instances of: settlements adjacent to a Frankish river ford (place where the river is shallow enough to cross without a bridge). The English "ford" and German "Furt" come from the same root; hence Francoford / Franken-Furt / Frankfurt. The "an der Oder" (at the Oder) and "am Main" (at the Main) suffixes are clarifications to describe which river is involved.

A similar thing happened with some football fans that made it to "Frankfurt (Oder)" instead of "Frankfurt am Main" for the Europa League's semifinals.

Judging from the published pictures [0], it looks like Google Maps may lead to the same result as Apple Maps for "Frankfurt".

[0] https://talksport.com/football/529808/benfica-fan-wrong-fran...

Searching for "Frankfurt, Germany" in Google Maps shows me Frankfurt am Main now at least. Hard to say if that's been changed at some point.

I remember Deutsche Bahn's website used to do the same with Freiburg and Freiburg im Breisgau. Almost made that mistake once.

It's a massive difference. Frankfurt am Main is one of the largest cities in Germany; Frankfurt (Oder) has a population of 50,000. You seldom refer to Frankfurt (Oder).

Haha! I actually flew into the wrong airport because of this. The two airports are about two hours apart by car.

A while ago I videoed three particularly silly Siri bugs:

• Not knowing Billie Eilish on some devices only: https://www.youtube.com/watch?v=MMkZGO5iFKw

• A fascinating interpretation of two and a half months: https://www.youtube.com/watch?v=Giq7bQl-jk0

• Playing the wrong song for no explicable reason: https://www.youtube.com/watch?v=2yj6rroaXL0

They really need to get a handle on this!

Amazing examples. Thanks for the videos.

I just asked Google "How many cities named London are there". Google told me that there are three cities named London in America. It completely ignored London, England as well.

I just bought a HomePod because the price has been reduced to a more normal amount; I think it started at £319, so £200 in the UK seemed quite good. I have to say the sound quality seems average to me. I admit it produces a lot of sound and bass from its small frame, but I like a bit of top-end sparkle and it basically seems to have zero top end at all.

As for Siri there is no point in asking it questions or talking to it for anything but changing volume and setting timers. A smart speaker this is not.

I suppose Apple need to stop being afraid and build a damn search engine that actually understands queries from real people. Whatever they are currently using to turn queries into responses is simply not working.

> no point in asking it questions or talking to it for anything but changing volume and setting timers. A smart speaker this is not.

For what it’s worth, that’s all anyone uses any “smart speaker” or assistant for: setting timers/alarms, asking for the weather, and maybe starting music.

Whether Siri is capable of more is a good question, but people don’t use them for any more.

They only use them for that because it sucks at everything else.

I'm talking about all 'digital assistants', Google's and Amazon's included. All people use them for is timers and weather.

I have a Sony UHD TV running Android, which comes preloaded with Google Assistant and supports voice search. I wanted to watch a movie, so I did a voice search for it. I expected a list of apps where I could watch it: Netflix, Amazon Prime, Disney+. Instead I was presented with a list of shops a few miles away. I found this interface very bad. When I search on a TV, it's to watch a series or movie. Why would Google Assistant give a list of nearby shops? That might be useful in Google Maps, but on a TV most people search to watch something and then choose the app that streams it. Why do smart companies like Google create such pathetic interfaces?

Likely because Google Assistant did not know it was running on a TV. It's probably the generic assistant that comes with Android. Not sure I would consider that pathetic; tuning all the intricacies of the Assistant for each application seems like a ton of work.

Not necessarily on topic, but my biggest gripe with voice assistants is that they work like a web search. You can ask what the state of the world is NOW, but you can't get them to let you know when things happen. I want "Alexa, let me know when XXX is having a concert nearby" or "Alexa, tell me when XX is on sale". That's what I really want.

Things like this would be killer. These are things a real human assistant could easily do, yet they are deceptively difficult for a computer.

If you type "16 oz in cups" into DuckDuckGo, it currently says 16 imperial fl oz is 1.894 US cups. If you change it to US fl oz, it is then 1.972 US cups, when most people would expect 2.0 US cups. This is because there are US nutritional fl oz and US customary fl oz: https://en.wikipedia.org/wiki/Cup_(unit)#Legal_cup. This sifting through of different kinds of "oz" is highly contextual, but Google gets it right.
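To make the ambiguity concrete, here's a sketch of the arithmetic, with the millilitre values taken from that Wikipedia article (the 1.972 figure comes from dividing a customary fl oz by the 240 mL "legal" cup):

```python
# Several coexisting definitions of "fluid ounce" and "cup", in mL.
ML_PER_IMPERIAL_FLOZ = 28.4130625      # imperial fluid ounce
ML_PER_US_FLOZ = 29.5735295625         # US customary fluid ounce
ML_PER_US_LEGAL_CUP = 240.0            # US "legal" (nutrition-label) cup
ML_PER_US_CUSTOMARY_CUP = 236.5882365  # 8 US customary fl oz

oz = 16
print(round(oz * ML_PER_IMPERIAL_FLOZ / ML_PER_US_LEGAL_CUP, 3))   # 1.894
print(round(oz * ML_PER_US_FLOZ / ML_PER_US_LEGAL_CUP, 3))         # 1.972
print(round(oz * ML_PER_US_FLOZ / ML_PER_US_CUSTOMARY_CUP, 3))     # 2.0
```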

Siri is particularly dumb in this regard. Even something as simple as ‘play again’ will not always simply repeat a track. The inconsistency is infuriating.

There are many more examples. Apple demoed the ability to ask for songs by asking Siri to ‘play that song from Top Gun’, which doesn’t work anymore.

On a somewhat related note, DuckDuckGo can be particularly bad at local search. I live in Ireland, and country-scoped search is simply broken, returning Australian sites over Irish sites. I have to qualify every search with "Ireland" or "Dublin" to get it to be in any way useful.

I have a HomePod and we're getting increasingly good at pronouncing French and Dutch artists with the accent of someone with no conception of foreign languages in order to get Siri to play them, e.g. "Gene Ferret" for Jean Ferrat.

Apple Maps will happily butcher Dutch and French street names to the extent that we cannot recognize them at all.

Siri will just make a random guess if it can't understand what I asked it to play. It's just so supremely awful at almost everything.

I've read that Siri collects/sends home significantly less information about the user than its counterparts. It might just be that "getting it right" requires a complete violation of user privacy.

I actually am perfectly fine with this.

DuckDuckGo got it right, and it doesn't send any user data. Also, I think a system shouldn't have to know anything about you to know that you're not asking about London, Canada.

Ah you may be right.

Personally, I'd give it all of my information short of financial/state ID/name and maybe accurate location if it means having a real personal assistant.


In a short period of time companies with databases of human activity will be finding new opportunities to profit off this data. Your insurance premiums, credit rating and even potential relationships will be influenced by a third party. When all you have to do is say 'Hey Siri what time is it in London, UK'.

Me: "Hey Google, set a timer for 15 minutes."

Google: "Created a timer for 15 minutes, how long do you want it to last?"

There are also hilarious things going on with Google Assistant on iOS, like Norwegian read with English pronunciation.

These voice assistants are terrible. Sure, they are better than before, but still really bad. And it's even worse in languages other than English, where stuff like the above happens.

I’ve always phrased the question as “What is the time in the UK” (as a British person living in the USA). It’s unclear to me whether my form of asking the question is because it’s the best way to get a “good” answer from Siri, though I don’t think it would ever occur to me to ask what the time is in London versus the last city I lived in (Bath) in the first place, since they’re always the same.

Maybe it's because you know that there's only one time zone for the UK, so there's no point asking "What's the time in London" versus "What's the time in Preston" when you know the answer is the same. But since the contiguous US alone spans four time zones, Americans intuitively specify the city, since "What's the time in the USA" is not a valid question.

> Maybe because you know that there's only one timezone for the UK

And that's where things start to get fun, because there are actually several time zones in the UK if you include its dependencies. I don't really know how it works in the UK because I'm French, so let's take France instead. "What time is it in France?" usually means "in metropolitan France", but now let's say you're in northern Brazil, close to the border of French Guiana. When you say "what time is it in France", do you mean "metropolitan France across the ocean", or do you mean "the closest French department, a hundred miles away"?

>> And that's where things start to get fun, because there are actually several timezones in the UK if you include its dependencies.

I think the UK only consists of England, Scotland, Wales and Northern Ireland. The UK has some control over aspects of the dependencies but they are not actually part of the UK.

Yeah, the Crown dependencies are distinct, self-governing jurisdictions; they're not part of the UK itself.

France is indeed the best/worst example as it is the current country spanning the most timezones (13).

Le soleil ne se couche jamais sur la République. ("The sun never sets on the Republic.")

What’s funny is that if I ask Siri what time it is in London, it gives me the same time as it is here in Waterloo (1hr east of London Ontario).

On the other hand, if I ask what time it is in Preston, Siri gives me the time in the UK, despite the fact that I’m only 15km away from Preston Ontario!

Now if I ask what time it is in Cambridge, it gives me the local time instead of Cambridge England. Preston Ontario is actually part of Cambridge Ontario.

So what it seems to be doing is picking the nearest place that’s well known, rather than the most well known place with that name. Preston Ontario isn’t really known at all unless you’re from this area. Cambridge Ontario is a little bit more well known, though still a far cry from Cambridge England (likely due to the university).

Ha, what's funny about that is I picked Preston as a random town in the UK that I thought probably doesn't have a name-twin somewhere else!

There's also Preston, Maryland in the US - finding UK town names which are not repeated elsewhere is hard!

It also makes driving around New England most confusing, since towns which _should not_ be in the same direction often are...

I just spent longer than reasonable zooming into random parts of Canada, it's quite funny seeing places like Morpeth south of London. And also satisfying to see Kent and Essex are neighbours in Canada, too. There were some surprising ones too, like Uttoxeter having a name-twin

It's fairly established that picking a capital or large city is the correct way to specify a time zone; that usually gives you what you want. The alternatives have various problems:

* Pick a country: Some countries have multiple time zones.

* Abbreviations like EST, CET: Not right in summer.

* Words like "Eastern Time": assume the country from context.

* Offsets like UTC-5: Don't follow summer time.

* Click on a map: India will ban your app because one pixel in Kashmir gave Pakistan time.

I'd suggest that the correct way to specify a time zone is the name from the IANA time zone database, which uses area (currently a continent or ocean) and location, avoiding most of these problems. Thus:

- America/New_York

- Europe/London

- Indian/Mauritius

- Pacific/Chatham
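A quick Python (3.9+) sketch of why these IANA names behave well: the same identifier carries the full DST rule set, so it yields the right abbreviation and offset in both halves of the year, unlike a fixed abbreviation or offset.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# One IANA identifier, correct in both January and July.
for name in ["America/New_York", "Europe/London",
             "Indian/Mauritius", "Pacific/Chatham"]:
    jan = datetime(2020, 1, 15, 12, 0, tzinfo=ZoneInfo(name))
    jul = datetime(2020, 7, 15, 12, 0, tzinfo=ZoneInfo(name))
    print(f"{name}: {jan.tzname()} ({jan.utcoffset()}) in Jan, "
          f"{jul.tzname()} ({jul.utcoffset()}) in Jul")
```

For example, America/New_York reports EST in January and EDT in July, while Europe/London reports GMT and BST; Mauritius, with no summer time, reports the same offset in both.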

> * Words like "Eastern Time": assumes the country from context.

This is a big bugbear of mine that seems to come up a LOT from Americans online. Eastern fucking what? Australia? Anglia?

>If you had a human assistant and asked them “What’s the time in London?” and they honestly thought the best way to answer that question was to give you the time for the nearest London, which happened to be in Ontario or Kentucky, you’d fire that assistant.

An obvious sign that you would be a shit employer to work for. OP is probably exaggerating here (at least I hope so), but if you were to fire someone over this, without educating them on how your mind works so the two of you can work together more effectively, then you don't deserve employees.

Context...means...everything. People make the most basic of mistakes all the time. Teach them the preferred outcome, and move on. Making a fuss over it shows your lack of maturity and ability to lead anyone.

Reminds me of kids who are in the "in" crowd making fun of kids who aren't. "What??? You've never seen Star Wars??? GUYS! Timmy's never seen Star Wars! I bet he doesn't even know who Obi-Wan is. What an idiot..."


> If you saw someone write something like "If you had a human assistant and asked them 'What’s the time in London?' and they honestly thought the best way to answer that question was to give you the time for the nearest London, which happened to be in Ontario or Kentucky, you’d fire that assistant." you'd be quick to call them out as an "obvious" bad employer.

No, really, this is just a tired nitpick. I hear a lot of complaints about Hacker News on other websites, about how the commenters are too nitpicky and "Actually, …", but I usually don't mind, or actually appreciate, many of those. This particular one really annoys me. Everyone knows that people make mistakes all the time, and clearly you shouldn't fire them for simple ones. But that's not the point that was being made! Sitting here claiming people are obviously horrible because you saw the word "fire" is just not productive at all.

I think the point translates just fine.

What I read was, "Siri doesn't give me exactly what I want by perfectly understanding my own personal bias that London means, London, England, so I am going to post a blog complaining about it, while simultaneously cracking a joke that I would terminate someone's employment if they made a basic mistake." Which I took to be pretty unproductive conversation.

Still, point made. OP would be terrible to work for and Siri could potentially use some improvements in accuracy.

> Siri doesn't give me exactly what I want by perfectly understanding my own personal bias that London means, London, England, so I am going to post a blog complaining about it, while simultaneously cracking a joke that I would terminate someone's employment if they made a basic mistake.

At some point, an assumption that an extremely large majority of people would reasonably make (I am hesitant to claim the number of "nines", but would guess it to be at least three) stops being a "personal bias" and becomes more of an "unless you were raised by wolves or in some sort of homeschool for math savants, you really ought to know what I mean". If I'm talking about Springfield and you take it to mean I am discussing the Simpsons and not one of the dozen Springfields that exist in the United States, sure, that is an honest, "basic mistake" (although I would prefer if you clarified the moment you guessed that something might be wrong with your interpretation). Thinking that I really meant the tiny town of London, Ohio, which I have never mentioned and more than likely don't even know exists, when I say "London" instead of the massive metropolis that is the capital of a major world country, is just being stupid: something is very off with your sense of context. (To finish the argument: if I said "London" and you assumed "London, England" but I actually did mean "London, Ohio", I would be extremely forgiving, to the point where I would lay the fault on myself for not clarifying when I gave such a misleading impression.)

> OP would be terrible to work for

Because they say they're not interested in employing people who cannot understand more context than a reasonable human being needs to quite literally live and interact with others?

> Siri could potentially use some improvements in accuracy

Siri could absolutely use some improvements in accuracy.

I'm with the author here. There's a certain amount of base knowledge that not having is just indicative of general cluelessness.

It's like if you asked someone to get you a cup of coffee and they came back with a measuring cup of coffee beans. It's not that you can't educate them that you actually wanted them to grind the beans and filter some water through them; it's that they're going to screw up lots of other stuff too.

You probably wouldn't fire them after the first time, but do you really think someone who makes that mistake won't just be more trouble than help to work with?

>You probably wouldn't fire them after the first time, but do you really think someone who makes that mistake won't just be more trouble than help to work with?

I've worked with some clueless people, all of whom were capable of learning with some instruction and guidance.

Had a cameraman once forget to press "record" at the beginning of an event, nearly costing me a client. I was pissed at the ignorance of the mistake, but I ate a lot of the frustration, mentioned that I need him to occasionally ensure the camera is on and recording during events, and... it never happened again.

I wouldn't fire someone for bringing me a cup of coffee beans instead of a brewed cup of coffee. I would let them go if they kept doing it after being provided instruction.

As an aside, I never ask anyone to make me coffee unless they are a trained barista, and even they make drinks that aren't what I want on occasion. Though I see your point, there is way too much preference in how to make an ideal "cup of coffee" for this to really be comparable, in my opinion.

That's not comparable at all. People make mistakes, they forget; if I asked someone to bring me a coffee and they brought me the beans because they had a brain fart, I might laugh and be OK with it. If they consistently brought me beans and water, I'd think they were being malicious, and if they brought me beans and water, even though they know what a coffee is, and genuinely acted clueless about it, I would be seriously annoyed. If you're going to bring up "maybe they didn't really know what a coffee was": no, that isn't the point.

> As an aside, I never ask anyone to make me coffee unless they are a trained barista, and even they make drinks that aren't what I want on occasion.

Actually, I don't even drink coffee! I fail to see how this is relevant to the discussion at all.

>I fail to see how this is relevant to the discussion at all.

The example OP provided is about how asking someone to bring you coffee can produce unexpected results. I furthered the point that it can still happen with trained and experienced baristas - people who make coffee for a living.

If someone is being obviously and intentionally malicious by doing a task incorrectly, then yeah, let them go. You'll both likely be happier for it.

> I furthered the point that it can still happen with trained and experienced baristas - people who make coffee for a living.

You've had a trained barista give you a cup full of unground coffee beans when you asked for a cup of coffee? I find that hard to believe.

Did you understand, or even finish reading, the article?

You utterly and completely missed the point. There is no actual human getting fired, or even the threat of someone getting fired. There isn't even a "someone"!

The point is to highlight the incompetence of Siri the virtual assistant, not to imply that someone needs to be fired. It's an analogy[0].

0: https://www.merriam-webster.com/dictionary/analogy

We all understand the point being made: this is a ridiculous answer by Siri.

The treatment of a hypothetical human-assistant-that-never-was is really irrelevant.

I didn't see any New Englanders in this thread point out this sign in Maine yet [1]. I grew up in northern New Hampshire, and many of the towns near me were named after European places like Berlin and Milan (purposefully mispronounced after WWII). Much of the southern part of the state is named after cities in England: Manchester, Dover, Portsmouth, etc. Close by is Portland, Maine, which is always confused with the one in Oregon unless you've lived there.

This is one of those AI contextual things which have no great solution until the machines can read our minds, or get a lot better at learning about us. Then, of course, privacy advocates will lose their minds.

1. https://upload.wikimedia.org/wikipedia/commons/1/1f/China_Si...

Search relevance is incredibly difficult, especially when there is not enough information in the query and there are multiple "correct" answers. Someone is always going to be disappointed. You can make the argument in this case that 90%+ of users would find that London, England is the better answer, but there are millions of these types of queries.

In a recent example, I was using my favorite search engine and showed someone a query for "toaster" where the top result was an infobox describing what a toaster is. They thought the relevance for the query was hilariously wrong, saying, "everyone already knows what a toaster is, I want results for buying a toaster".

IMO I thought the results were perfect based on the query. Neither of us was wrong.

The expectation for search tools has dramatically shifted: many people now expect them to have oracle-like qualities.

> Search relevance is incredibly difficult, especially when there is not enough information provided in the query and there are multiple "correct" answers. Someone is always going to be disappointed.

Sure, but in that case you go with the answer that will be the correct answer for the most users. Which in the example in TFA would almost certainly be London in the United Kingdom, not some other London in Kentucky or Canada or what have you. You'll end up disappointing those users who actually want to know the time in those Londons, but there are going to be a lot fewer of those.

I agree, this is a fairly clear cut example.

But what do you do in situations where people are closer to evenly split on what the right answer is, or where there are 10 answers that could each be relevant to a similar share of people?

A query leading to the single answer that is relevant to the user 100% of the time is impossible.

There could be a mechanism for the system to get additional information when needed, but that also has some user experience issues.

These are all problems that are most likely solvable to a degree though.

On a side note, I also don't think Siri is very good compared to some of the other voice assistants.

Siri got it wrong and you got mad, we get it.

The childish rant about firing assistants is unnecessary.

Is it? Siri is a digital assistant. The point is it should be held to a much higher standard. It's currently awful to the point of uselessness. So what standard of assistant is it? Not one you'd employ.

I think treating Siri like a digital version of a Personal Assistant is going a bit far. It is a simple Voice Interface to a bunch of apps, local databases and cloud databases.

Nobody has ever said "Should I buy an Apple Watch for $750 or hire a PA for $35,000/year?".

I don’t think asking the time in London is asking too much of a VA in 2020.

I would expect a 10-year-old core feature on my $1750 phone/watch combination to perform better than my 3-year-old.

I'm not a parent, so just speculating wildly, but there's a good chance your 3-year-old has already cost more than $1750. Also, having met a few in my extended family, I would doubt that a 3-year-old not from London would actually know the time there, even if they could properly disambiguate the question. The phone seems like good value by that comparison.

Google demoed an AI that could call the barber and make an appointment for you about a year ago.

I think it shows a lack of imagination on your part to NOT make that connection.

> lack of imagination on your part to NOT make that connection

I hope you didn't fire your PA after that AI demo!

I think it’s an exceptionally American response.

It reminds me of Gamers Online who demand that people be fired whenever the slightest mishap happens.

You'll be relieved to know the assistant in question doesn't really exist. No-one was harmed in the making of this post.

That's AIcist :D

I'm too lazy to clarify my thoughts, you're fired.

Too lazy to clarify that by "London" you don't mean London, Ontario? Give me a break.

If you have to constantly "clarify your thoughts" because your assistant is incapable of basic inferences based on context and general knowledge, then they aren't likely to be much use as an assistant.

Maybe, given GPS data and the fact that due to a pandemic there's a travel ban, the actually intelligent answer is in fact "London, Canada", and only a stupid person would fail to see that?

We can spin this all day long. The truth is, if you want a very specific answer, and you fail to ask a specific question, then you're the only one to blame.

How far would you take that though? If I say to my assistant "what time is it?" should I also specify that I mean the time at my current longitude and latitude. If I say "Could you please print this for me," must I specify to print it on the printer in my office and not one in Barcelona, using letter paper, in a font that isn't Comic Sans, and that it shouldn't be thrown in the trash right after it is printed. Or should I assume that a reasonably intelligent person can infer all that?

I see a lot of people here making references to the pandemic, but as far as I remember this behavior hasn't really changed since last year, before the pandemic. John Gruber presumably can't travel to either London right now anyway, so choosing the nearest London rather than the most well-known London isn't particularly smarter just because there is a pandemic.

Also, I personally ask a question like this (and I assume many people do as well) when I want to call someone there, which is notably unaffected by the pandemic.

I think dates, times, UTC offsets, and locales/cultures is a topic we frequently think of as "that's easy" [1] when in practice it's painstakingly hard to get right.

As an example, we've spent the past few days on our eng team refining our spreadsheet functions for date/time handling, and it's like the 5th time we've iterated on this (after supporting everything Excel / Google Sheets do).

Funny part is, I'm sure we'll iterate on it even more -- it's hard to get this topic both right & make it easy to use / approachable.

Btw, does anyone have good reading materials on this topic? (date/time/locale handling)

[1] I'm biased as a founder at https://mintdata.com, but thankfully our engineers set me straight on the subtleties :D

>Btw, does anyone have good reading materials on this topic?

I had a slide deck somewhere from when I was at a broker trader and leap seconds mattered (they (can) happen around 10am in East Asian markets on Jun 30).

The moral of the story is: time measurements are fractally wrong. It doesn't matter what format/time system you pick, there will be a use case that breaks it badly. Local time + timezone, epoch, UTC, TAI: doesn't matter, it will break somehow.
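A small Python illustration of one such breakage (this is my own example, using the stdlib zoneinfo module from Python 3.9+): when clocks fall back, the same local wall-clock time names two different instants.

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+; may need the tzdata package on Windows

# In New York, clocks fell back on 2020-11-01, so 01:30 local time occurred twice.
tz = ZoneInfo("America/New_York")
first = datetime(2020, 11, 1, 1, 30, tzinfo=tz)           # fold=0: the first 01:30 (EDT, UTC-4)
second = datetime(2020, 11, 1, 1, 30, fold=1, tzinfo=tz)  # fold=1: the second 01:30 (EST, UTC-5)

utc = ZoneInfo("UTC")
print(first.astimezone(utc))   # 2020-11-01 05:30:00+00:00
print(second.astimezone(utc))  # 2020-11-01 06:30:00+00:00
# Identical local timestamps, one real hour apart.
```

Any system that stores "local time + timezone name" without the fold bit silently loses this distinction.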

Your MintData software looks great. I would love a reasonably MS Access type app like this priced at a level I could recommend to or use on behalf of family members.

I hopefully and naively clicked pricing ;)

Some honest feedback would be to rename Personal. A Personal account usually means individuals, but no home user or hobbyist is going to pay $95/month unless they are making money from it.

Any developer who thinks dates, times, UTC offsets and locales/cultures is "easy" is grossly incompetent (unless, of course, they've never worked with human-facing software before, in which case their naiveté can be forgiven).

“Human facing”? Any software worth its salt needs to deal with this stuff. Unless it’s a toy program. I quite enjoy timezone stuff, perversely. Might be Stockholm?

If you're writing firmware for an embedded device without a clock then you'll never have to deal with this. If your software only ever has to interact with other software then you wouldn't ever use time zones unless you really have to. Only humans demand time zones.

I think you underestimate newbie developers. The nightmare that is dates and times is something that you learn from experience, not at university.

Yes, I should say "experienced developer".

It’s the typical example of the meaning problem. Any token derives its meaning from a context. Think of the context as the requirement of setting up a careful experiment to measure the spin of an electron. Without the context, we have reasonable confidence that it’s an electron but don’t have any clue whether it’s spin up or spin down.

Similarly in this case, we’re reasonably confident that London is a place on earth with a property of time that is being asked. But without the additional context (or supplementary logic) we can’t know for sure which London it is: Canadian, American, or English.

Until we provide such context, London is in a superposition of all the possible meaningful states.

There's a lot of context that goes into this question. Almost all grown-ups understand time zones at least enough to know that somewhere far away (at the other side of the equator or longitudinally) is likely to have a different time of day at any given time. On the flip side, we understand that somewhere sufficiently close, like the next town over, is at least in general unlikely to have a different time of day. So for an intelligence (artificial or otherwise) to choose between London, a few kilometres away, and London, UK, would depend on at least these factors:

- How important is either London in your life? If you commute to London, Canada, then it's going to be infuriating to always get answers about London, UK.

- Is the answer for London, Canada different from "What time is it?"? If it's the same, then there's at least a fairly good chance that you knew it was in the same time zone, and you didn't intend to ask about it.

- Asking about the time is fundamentally different from asking about a whole lot of different things. Let's say you ask for the biggest manga book shop in London, which London depends on whether you're currently near any London, whether you've been to any specific London before, whether you've got tickets to go to a conference in London, whether London near you is even big enough to have a manga book shop.

All in all, no, it's not obvious that "London" always means the globally most important London.
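As a sketch of how those factors might be combined (every name, number, and weight below is invented for illustration; no real assistant necessarily works this way):

```python
# Hypothetical scoring of candidate "London"s: global prominence,
# personal relevance, and whether a time query is even informative.

def score(candidate, user):
    s = candidate["population"] ** 0.5      # global prominence
    if candidate["name"] in user["frequent_places"]:
        s *= 10.0                           # personal relevance dominates
    if candidate["tz"] == user["tz"]:
        s *= 0.2                            # "what time is it there?" is trivial in your own zone
    return s

candidates = [
    {"name": "London, UK", "population": 9_000_000, "tz": "Europe/London"},
    {"name": "London, Ontario", "population": 400_000, "tz": "America/Toronto"},
]
user = {"tz": "America/Toronto", "frequent_places": set()}

best = max(candidates, key=lambda c: score(c, user))
print(best["name"])  # London, UK
```

Note that for a directions-style query (drop the same-timezone penalty), the frequent-places boost would flip the ranking for the commuter case above.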

Even us folks who live in Ontario refer to London in Canada as "London Ontario" when talking about it, unless it's extremely contextually obvious that you _don't_ mean London, UK.

So yes, it is obvious to us humans who have London Ontario in our lives that "London" refers to London UK.

I'm pretty sure there are plenty of people living close to London Ontario who refer to it as simply "London", but that's beside the point. We are nowhere near being able to algorithmically take into account everything every single person considers "obvious" in AI. Software today, whether it's marketed as "AI", "expert system" or some other IMO bullshit term, is absolutely nowhere near what a non-developer, non-marketing person would describe as such.

> I'm pretty sure there are plenty of people living close to London Ontario who refer to it as simply "London"

Have you seen this comic by chance? Feels relevant here...


But go ahead and keep arguing.

I grew up in Lucknow, India (a tier 2 city with around 3MM people at the time) and had multiple assistants think I meant Lucknow, Ontario, a town with a population of 1,000 people.

Having been to London, Ontario, I can confirm that it also doesn't have quite the same heft as London, England :)

Reminds me of a bug I kept having to resurrect at Spotify: the first result for "Billie Jean" was an admittedly good cover by The Civil Wars, but the result was clearly wrong. And it wasn't a case of "Man Who Sold the World," "Hurt," or "Girls Just Want to Have Fun" where the cover is arguably the definitive version. Turns out search is hard.

Oh, and if you search for "Billy Jean?" Still broken.

At some point I wonder if business incentives are a problem

Eg does Spotify pay more to play the original than the one by The Civil Wars?

If so, how much money are they saving by such "bugs"?

And if not, how is popularity (using an objective-ish metric, e.g. wiki page length) not a major feature?

The problem here is context. I can’t imagine an algo that would get it correct 100% of the time on the first run. An actual assistant would learn from and adapt to you. Making a mistake is ok but you can’t repeat the mistake over and over. If you meant London, UK you should be able to say: “No Siri, I meant London, UK.” And that would be the last time Siri is wrong here. Sadly, this is not the current state of AI.

>You wouldn’t fire them for getting that one answer wrong, you’d fire them because that one wrong answer is emblematic of a serious cognitive deficiency that permeates everything they try to do. You’d never have hired them in the first place, really, because there’s no way a person this stupid would get through a job interview.

I feel like we're really bad at distinguishing deficits in intelligence from deficits in acculturation.

Siri is terrible.

"Call mom" sometimes calls my mom, who is in my favorites (I have 4 people in favorites: mom, wife, daughter, brother).

Sometimes it decides to call "wife's mom"

Sometimes it says "which mom" then lists "mom, wife's mom"

Mom, literally the word mom, is in my favorites. I use it almost daily.

And yet, Siri doesn't remember this, doesn't create relevancy, it doesn't even decide "Let's look at Favorites first".


FWIW, if you never refer to "wife's mom" as "mom", you can change the required pronunciation in the contact details. Then, Siri will only activate if you say "Call wiyefesmom"

Did it ever get out of beta? It feels like different years, same story. Heck, one guy sued Apple over a Siri ad in 2013.


The real race for personal digital assistants is between Google and Amazon.

Guess even Cortana can beat Siri, day in, day out.

I was completely confused this week asking Google about the "UK bank holiday". It answered not with the one in 3 days' time, but the one next year.

Screenshot: https://twitter.com/Martin_Adams/status/1263578310266679296?...

Among a bunch of other services, the company I work for offers a database of locations and travel/tourism based info. Airports, attractions, etc, and I'm the lead on that.

Twice over 5 years I've gotten phone calls involving our CEO and a higher-up at an airline who was absolutely furious that we had been advertising London International Airport (YXU) as somewhere in Canada for a long time. I still remember how funny it was having to tell them to look up YXU on Wikipedia, and that they probably meant to search for "London Heathrow Airport" (LHR). I wonder if we still get those angry calls, but now people know the answer before it reaches me.

I've also made a similar mistake years ago. I was excitedly looking for a nonexistent building in the University of Miami (Florida), because I found online that the Miami University had a star-gazing club open to the public that met at 10pm once a month. Miami University is in Ohio... :(

I get this problem all the time and I live in "the right" London so it is not just a "nearest London" thing.

It is not just digital assistants, but so many other things like Google Maps, e-commerce sites, address auto-completes, etc. that seem to assume I want the North American one with a population of 300k that no one knows about, not the one that everyone has heard of with a population of 9 million.

I've always just put it down to the usual cultural blindness that we come to expect from SV companies: that there isn't anything beyond North America (e.g. "global launches" only being for USA, Canada, and Costa Rica, etc.) and that if your language is "en" then you must be American or Canadian, with everyone else being funny foreigners that "we don't support, sorry".

Just to report that it isn't necessarily/solely cultural myopia at work here: I live in Ontario and have family in London, Ontario, and while visiting them with location services on have both asked for and googled "what is the temperature in London, Ontario" (notice that I said the region!), and gotten results for London England. I left angry feedback with Google over it once, but it's happened many times.

Sometimes these systems just make baffling decisions. One time years ago, while waiting for my plane to begin deboarding at JFK Airport in NYC, I tried setting my MacBook's timezone to Eastern manually. I started typing "New Y" into the Closest City field. OS X helpfully auto-completed it as "New Yekeba, Liberia": https://evan.tumblr.com/image/65341168142

As best I can tell, New Yekeba is a tiny resettled former mining town with minimal population. There are very few English-language web pages about it. Somehow this ranked higher for Apple's auto-suggest than the most populous city in the US.

Looks like the autocomplete is going off alphabetical order rather than some other characteristic of the city?

I guess that's the next innovation. In the old days, you entered the entire search string. Then you got Google Suggest, where each keystroke was trapped and responded to. Now not only do you need to trap each keystroke, you need a weighted list of results so you can show them in order of popularity.

Not an edit: I am pretty sure Google is already doing it, so it is not an innovation; it's been there for a while. I am calling it one for the folks who are not doing it. Also, it may be there in general search, but it may have seemed like excessive data for searching time zones on your phone. But with billions of phone users, such usage would be common and the feature will happen, eventually.
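The core idea is small; here is a toy sketch (the city list and counts are made up, and real systems weight by query logs rather than raw population):

```python
# Popularity-weighted autocomplete: rank matches for the typed prefix
# by a weight, rather than alphabetically.

cities = {
    "New York, US": 8_400_000,
    "New Yarmouth, JM": 5_000,
    "New Yekeba, LR": 1_000,
}

def suggest(prefix, k=3):
    matches = [c for c in cities if c.lower().startswith(prefix.lower())]
    return sorted(matches, key=lambda c: cities[c], reverse=True)[:k]

print(suggest("New Y"))  # ['New York, US', 'New Yarmouth, JM', 'New Yekeba, LR']
```

With an alphabetical sort instead, "New Yekeba" would beat "New York", which is roughly the failure described upthread.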

Weighting results in Search Autocomplete is something Google has done since launching it.

(Disclosure: I work at Google, though not on this)

I considered that, but in that case it would have suggested New Yarmouth (among other places) first.

"e" is alphabetically before "o" ?

Hey, it's correct when sorted alphabetically! Good enough, commit and ship it!

>I left angry feedback with Google over it once, but it's happened many times.

That's not going to get you anywhere - they're not listening.

There is an entirely new class of species breeding over there: "Artificial Humans" [1]: a quantum-state thing that may or may not be a real human. You can never tell. This is beyond the Turing test: a real human behaving as a bot, or vice versa.

[1]. https://trialx.com/blog/google-play-bans-our-covid-app-built...

On the bright side, screaming at the void can be therapeutic for some people...

The modern day equivalent of writing a stern letter and just leaving it in a drawer.

You could send it to The Times and see if it goes to the London version? ;)

It's not true. But reports are very valuable when filed with helpful information.

Simply use the airport code YXU ;)

To be fair, where I'm from, "London" would probably refer to London, ON, but that's just because I live in Southern Ontario. So it's not like no one has heard of it.

On the other hand: no one in Southern Ontario will be asking what time it is in London, ON because it's all in the same time zone, so this seems like a problem.

Well, and it's further complicated by the context of what's being asked. Coming from Waterloo Region, if I ask what time it is in London, I almost certainly mean in England. But if I ask for directions to get to London, I probably mean the one that's a few hours away by car.

So it's not as simple as just fitting these queries into generic templates and sending all instances of $LOCATION to a common geocoder.

Yes, but we expect those assistants to be a little smarter than a college programming assignment...

Now if we could just wipe Waterloo IA off the map...

It really highlights how all language is incredibly ambiguous, and depends on all sorts of cultural and semantic contexts to disambiguate, despite the inherent ambiguity not being at all a problem for its users.

There's so much more to language than just a string of sounds or characters.

This is rather inconvenient when shopping for niche products. An American website assumes that everyone on the internet is American, and you have to go through the shopping cart experience to learn that they don't ship to Canada or Europe.

See also: "Please select a state" dropdown with no "Not US" option.

TBF internationalisation is complicated and hard to do well. So is international shipping.

But compare with a European site like thomann.de, which not only handles currencies and taxes but also (mostly) translates the product descriptions into most European languages.

That's dedication. Having worked for a company that took care of online shops for clients, I can say most of them would balk at the cost of doing that.

Indeed! Is it so hard to just state upfront where you're willing to ship? The worst is when they later send you emails because you "left something behind".

And too many app developers assume that all their users have massive/unlimited data plans. This is usually correct for... most developed and developing countries, but:

They forget that in the telecom-backwater of Canada, a 4 or 5gb/month plan is near the high end.

I’m looking at you NPR One app for thinking buffering 300-400mb of content is a good idea.

It's not just app developers. The Android devs only appear to spend their time on flagship phones and assume they can burn system resources that bring cheaper phones to a crawl.

Even fast devices have slow internet in the underground. So many websites load 3mb of data for a few kb of relevant info.

To be fair, there is a tendency among Americans full stop (with the honourable exception of the good denizens of this and other likeminded sites) to assume that everyone on the internet is American.

HN is not an exception. It's the most stark in threads about salaries.

I'm interested to see this thread but posted 6 hours later. This happens on forums that are very EU/NA heavy--or games.

The thread seems overly negative towards Americans and criticizes them heavily because half of America is still asleep and the other half just started working. If OP posted it later Americans would likely be filling the discussion with the reasons why your complaints are happening.

My personal opinion is that US-centrism is obviously a thing, but EU people are so used to seeing it that they then mistake other things for it.

This same thing happens with Spain and Mexico as well!

Hacker News is an American site, specifically a Silicon Valley site. If you went to a British site, they would be talking about things from a British context. All are welcome, but it’s a bit ridiculous to complain that every conversation on the internet doesn’t support multiple currencies.

Correction: Hacker News is an American-founded site with international community of users. That's not a site for Americans where the rest of the world just comes to watch. Or maybe I missed a point in the guidelines...

I'm not talking about currencies. I'm talking about sentences like "if you move to the east, you'll earn...", "in the south, it's..." in conversations with no prior indication that it's just about US and "the east" means "the east of US" and "the south" means "the south of US". Because US is the whole world and there is no need to clarify...

I don't think this is an American thing as much as it is a 'country with large population' thing. Internet users all around the world do tend to frequent sites based on regional preferences, and the human brain is exceptionally prone to identifying patterns. I'm sure if I were to hang out on Weibo or VK I'd find a bunch of people assuming everyone was Chinese or Russian.

In my view it's the language. English is so common everywhere that young people around the world are pretty good at it (including me), so they get onto the internet from a young age and start consuming English (mostly US) websites, social media, and movies etc, which further strengthens the grip the US has.

Meanwhile, you'd struggle to find an American, or any one from the Anglosphere really (maybe with the exception of Canada) who speaks Greek, French, Arabic, Vietnamese, etc.

Almost any results from DuckDuckGo have this affliction.

Go to https://duckduckgo.com/settings and change the Region to wherever you are for better regional results.

Was thinking this exact thing! All these companies are hugely US-centric.

It's like this on maps too (Apple Maps is the worst): you can search for something fairly straightforward that should be local and it'll find you something in California.

But I agree with the general theme of Siri being the dunce of the personal assistants. Mostly can't even hear its own wake word. You have to whisper Alexa's name when talking about her to prevent activation and in the same environment shouting at Siri produces nothing but silence.

It’s not US centric. It’s just that Siri, and the others (to a lesser extent?) are dumb.

Try asking Siri for directions to your next appointment. She will tell you all about it, including where it is supposed to be, but she won’t give you directions. Mind boggling.

And it’s not uncommon for me to search for some place in the Midwest USA, and get a result in Europe.

In the recent (~5y) past, I've asked Google Maps, from within London, a route to Charing Cross Station and been given the one in Glasgow.


That’s even worse than a pre-smartphone fail I encountered ~15 years ago.

I was on a train from London to Liverpool Lime Street, and somewhere around Nottingham one of the fellow passengers asked me when it would arrive in Liverpool Street Station.

From Cambridge (UK), you can take trains either to Liverpool or to Liverpool Street Station (in London). Refugees would often be released from a nearby detention center with a ticket and instructions to take the train to Liverpool. More than one ended up lost and befuddled in Liverpool Street.

[2nd-hand story from, likewise, 15 years ago. Accuracy not guaranteed]

See also: New York Penn Station and Newark Penn Station are consecutive stops on the North-East Corridor line, and sound identical in conductor-ese. Causes regular panic attacks from first-time riders.

Had similar encounters with people asking how much longer it would take to get to Ashford, as their Eurostar train was leaving soon, while on a train that stops at Ashford (Middlesex) instead of Ashford International in Kent.

cf. my disaster in Italy when I got off the train at Venezia Mestre (basically an industrial estate at the wrong end of a road bridge) instead of Venezia Santa Lucia (the bit with the canals and architecture).

"Charing Cross (Glasgow)" Station in Glasgow is the true Charing Cross. I can only applaud Google Maps there.

You were looking for the vastly inferior "London Charing Cross"

Yet, if someone says Charing Cross in London, they do probably mean London Charing Cross nevertheless.

What makes your comment fun is that I can almost hear it in the angry Scottish accent of a Mike Myers character. “If it isn’t Scottish, it’s shite!”

In the past few months I decided to give Apple Maps another try to see if it's any good. I tried to get directions from Manchester Piccadilly to Manchester Victoria. It tried to send me to Victoria BC, Canada.

Don't forget the "world series" for major league baseball played in the US and Canada. ;-)

It was the World Series long before Canada was involved. :-)

A tired quip that doesn't get more clever with time.

Guess what?

The "World Cup" didn't invite any soccer teams from Africa or Asia to their 1930 tournament but they still called it a World Cup. The UK wasn't even part of FIFA at the time but they still pretended it was a "world cup".

The baseball World Series has been contested since 1903. When are they going to invite the rest of the world?

Whenever they join MLB, just like you have to join FIFA to be in the World Cup.

Do they even accept applications? The FIFA application process is apparently not very easy, but 211 national associations have still managed to join.

Aye and nobody invited Antarctica to the most recent one, so it's still not a World Cup!

Even in 1930 it involved several nations, and that was /ninety years ago/

Along the same lines, I always laugh at announcers proclaiming the Super Bowl winners as the world's best (or something along those lines).

I mean, it is true. But it still irks me for some reason.

For time you probably don't want the nearest. If you are 200km from London, Ontario you are most likely in the same timezone so you want London, UK. However for driving you do want the nearest.

Another irritating bit about this is that when I search for news about my country in my language, Spanish (I live in London, the UK one Siri, not the American one), I still get a number of English-language results first.

Did you search in Spanish or English? Also, the locale/environment options: were they in English or Spanish?

But the whole language and geographic association is one area in which many software experiences seem to get mixed results.

My pet gripe is when I install software and it offers English (American) and no English (UK) option. Though less impactful than some of the language mishandling, like yours.

Can we please also take a moment to wish tiny inconveniences upon designers of applications which ignore my locale and present me with "11:06 AM" or worse, "04/05/2020"?

The best are the ones that have a settings page where they try to give an example, but use a day-of-month <= 12, like your example. It couldn't have hurt to use an example of "22/05/2020"; then it is still infuriating, but clear.

> It couldn't have hurt to use an example of "22/05/2020"; then it is still infuriating, but clear.

Assuming you're addressing OP with this part, I think that was their point.

When an application presents you with "04/05/2020", you have no way of knowing whether it means crazy American dates or normal-people dates.
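To make the ambiguity concrete (a small example of my own): Python's strptime happily parses the same string both ways, and which one is "right" is pure convention.

```python
from datetime import datetime

s = "04/05/2020"
us = datetime.strptime(s, "%m/%d/%Y")  # April 5 under the US convention
eu = datetime.strptime(s, "%d/%m/%Y")  # May 4 under the day-first convention

print(us.date(), eu.date())  # 2020-04-05 2020-05-04
print((eu - us).days)        # 29 days apart, from the very same string
```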

This is reason enough for me to use 2020-04-05 exclusively. Lexical order is a bonus.
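A quick demonstration of that bonus (example dates made up): ISO 8601 strings sort chronologically as plain strings, while day-first strings don't.

```python
iso = ["2020-04-05", "2019-12-31", "2020-01-15"]
dmy = ["05 Apr 2020", "31 Dec 2019", "15 Jan 2020"]

print(sorted(iso))  # ['2019-12-31', '2020-01-15', '2020-04-05'], i.e. chronological
print(sorted(dmy))  # ['05 Apr 2020', '15 Jan 2020', '31 Dec 2019'], i.e. day-of-month order, useless
```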

Just wait, someone will implement that as May 4, 2020.

> "04/05/2020", you have no way of knowing whether it means crazy American dates

In just about every professional setting I've been in (in the US), Americans are just as confused as everyone else. We deal with people and documents/data from the entire world too. We look for clues elsewhere, such as other dates from the same source that have any number greater than 12, etc. There are very few documents these days where you can be 100% confident without some other confirmation.

I so much just prefer Apr-05 2020. Clear to all.

I once had to write software that would create labels with a date, and said software would be used in both the U.S. and Ireland. I tried to push for the ISO standard of 2020-04-05 but couldn't get buy-in. Was finally forced to compromise on 2020-Apr-05.

While clear to all [English speakers], it still has US bias. Outside of the US it's more common to write 05 Apr 2020.

Until you sort lexically, then madness.

Are we talking BCE or CE? :)

Were there Aprils in BCE?

Yep, if you're presenting to international audiences then either one of those two is unacceptable. First one because in a lot of countries people genuinely won't know what AM means, the other because there's only one clear way of reading it, but it's not consistent between countries.

>if you're presenting to international audiences then either one of those two is unacceptable...

I used to have all my domain names registered with Namecheap. I'd get emails from them saying things like '<whatever>.com is due to expire on 3/7/2018' --so, naturally, [being based in UK] I'd note on my calendar to renew the domain at the beginning of July, only to have it almost expire on me at the beginning of March.

I emailed Namecheap several times about this, asking them to use less ambiguous dates in their reminder emails and, each time, got snotty replies saying "We are a US company and we use US date order" [which wouldn't have prevented them using something like "March 7 2018"], to which I pointed out that they had a global customer base, so should avoid expressing important data in formats which their overseas customers would mis-read.

They refused to budge on this so, as each domain expired, I renewed it with another registrar instead and Namecheap lost all my business, my repeat business and my future business for ever.

All because they were too damned stubborn to move away from using an ambiguous date format.


The AM isn’t the main problem. It can be looked up. The issue is whose 11AM? What time zone?

If someone doesn't know what AM means, they don't know English anyway.

It's Latin

It's used in English, therefore it's English.

Or calendars that start on Sunday (since I'm European and am used to calendars starting on Monday). Forget to notice the label once and one could end up booking a hotel room for Thursday to Saturday.

So most of Google? Google Calendar, Search, Maps and others all tend to be very stupid in that aspect.

I actually didn't realize am/pm wasn't something Europeans did...

UK'ians do use it. French, Germans, Swiss, BeNeLux, Italians, Spanish, Polish not. Don't know about the rest, but would be surprised if anyone in Europe but the UK used it.

For written times, no. If you're arranging to meet a friend you'd write "see you at 2030", but you'd say "eight thirty" out loud.

We love what US calls “military time”

24h clock isn't exactly military time.

14:30 in military time would be fourteen hundred thirty hours, or one-four-three-zero hours, or in written format 1430Z, Z being Zulu (UTC), or an actual offset.

"military time" is a 24 hour clock.

14:30 is almost without exception read as "fourteen thirty" in military time.

I've never seen it either of the ways you describe above.

They still know what it means though. Clocks are still divided into 12 in Europe.

Not if you don't have to use it for whatever reason. Personally I never learned the difference between am and pm before I got a job where I had to interact with people who used it. Still not sure if noon/midnight is 12am or 12 pm though ...

In Greece it's sometimes used.

10:00π.μ. = 10:00 προ μεσημβρίας = 10:00 before noon

10:00μ.μ. = 10:00 μετά μεσημβρίαν = 10:00 after noon

Transliterating this into the Latin alphabet is fun since it gives "p.m." for before noon.

Indeed! My mother once mistakenly set an alarm for 6π.μ. thinking it was 6 p.m., and suddenly at 6 in the morning it was blaring for no reason.

That's what living for so many years in another country does to your native tongue

Uh, they invented it. I guess they forgot.


  %d %b %Y %H:%M:%S %z
as God intended!

(and i’m an american!)
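For reference, that strftime pattern renders like this in Python (a quick sketch; note `%b` is locale-dependent, so the month abbreviation assumes an English locale):

```python
from datetime import datetime, timedelta, timezone

dt = datetime(2020, 5, 22, 9, 28, 30, tzinfo=timezone(timedelta(hours=5)))
print(dt.strftime("%d %b %Y %H:%M:%S %z"))  # 22 May 2020 09:28:30 +0500
```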

Is it a joke?

Year should be in the first column.

Agreed - really wish more people would use ISO date format: https://en.wikipedia.org/wiki/ISO_8601
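One concrete payoff of ISO 8601 (and the cure for the lexical-sort madness mentioned earlier in the thread): with YYYY-MM-DD, lexical order is chronological order, so plain string sorting does the right thing. A minimal sketch:

```python
# ISO 8601 dates sort correctly as plain strings, no date parsing needed
dates = ["2020-04-05", "2019-12-31", "2020-01-15"]
print(sorted(dates))  # ['2019-12-31', '2020-01-15', '2020-04-05']
```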

I think formats with month and day before year are popular (and possibly emerged originally) for cases in which the year's only really included in case you're not sure whether you're reading something old and the date's already passed, that is, the date is referring to something 12 months or less away, which is probably a very high percentage of dates people encounter day-to-day. This puts the most useful information up front. The alternative being dates that omit the year entirely, which are also very common.

There’s a subreddit for that too. https://old.reddit.com/r/ISO8601/

  22 May 2020 09:28:30 +5
is a perfectly unambiguous date stamp for human reading.

by all means use ISO 8601 for everything else, but i genuinely prefer the above in daily human-facing use.

it's in common use in the US military, and at least passingly common elsewhere. Not something normal people do here, but it'd be nice.

"May" is English, though.

Otherwise, good format.

Seriously. All the apps that default to 24 hour time when it's not what my device is set to are annoying.

Would I be incorrect to assume that something like 17:00 will be understood by literally everyone in the world (even if it's not their preferred format)? Using AM/PM is worse in the sense that, say, someone in Poland may literally have no idea which means which. A 24-hour clock has no ambiguity. Unless I'm wrong and there are countries or cultures that literally won't understand what time 17:00 is?

I think almost everybody in European countries knows about AM/PM, but I certainly get confused by 11:59 AM vs 12:00 AM

I am from Poland, and while we understand 12-hour clock and use it interchangeably with 24-hour one (well, mostly old people, young people just use 24-hour clock) - nobody uses AM/PM.

In Polish it's 8 rano (in the morning) vs 8 wieczorem (in the evening).

I can't remember which is which in AM/PM, and have to check each time :)

And the issues with midnight/noon are frustrating and 12-hour-clock should just die.

I've never been able to keep 12 AM vs 12 PM straight in my head. If I have to put something on my calendar or set an alarm for that time, I always use 11:59 AM or PM.

Is there a mnemonic or trick for remembering?
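One trick that sidesteps the mnemonic entirely is to treat 12 as 0 before adding the PM offset; midnight and noon then fall out of the arithmetic. A minimal sketch (the function name is my own):

```python
def to_24h(hour: int, meridiem: str) -> int:
    """Convert a 12-hour clock reading to a 24-hour hour.

    12 wraps to 0 first, so 12 AM (midnight) -> 0,
    and 12 PM (noon) -> 0 + 12 = 12.
    """
    h = hour % 12  # maps 12 -> 0, leaves 1..11 alone
    if meridiem.upper() == "PM":
        h += 12
    return h

print(to_24h(12, "AM"), to_24h(12, "PM"), to_24h(6, "PM"))  # 0 12 18
```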

>>I think almost everybody in European countries knows about AM/PM

Again, as a Polish person I don't think that's true at all, certainly in Poland it isn't.

On the topic: Siri will ask you to clarify if you set a reminder too close to midnight. "Did you mean 11am tomorrow?" It also works on the other side: set a reminder for "2" at 1am and it'll ask you to specify.

Never think of AM/PM except when dealing with badly designed software.

In the US, I have definitely met people who will profess not being able to understand military time (this is a common way to refer to 24-hour time). They would recognize it as a time, but not get any other context from it.

In the real London it is not entirely uncommon to find yourself in a conversation with American tourists where they're asking/complaining about "military time" on train timetables. The odd thing about it being most platforms have a countdown-style timer much like you'd find in a US metro system.

I understand that sometimes they're just trying to interact with locals while on vacation, but it always seems like an odd starter. Depending on my mood you'll get a friendly chat about time displays¹, or a curt "it says 2 minutes 'til next train".

1. 24hr analog clocks worth visiting https://en.m.wikipedia.org/wiki/Shepherd_Gate_Clock , or metric time, or Swatch time, or ... I'm just as boring with strangers as with friends ;)

17:00 is unambiguous. 9:00 could be AM or PM. 09:00 nearly always means 24H format; but what about 11:00?

Siri should just start using "Zulu" time. /s

https://en.wikipedia.org/wiki/Date_and_time_representation_b... with a nice svg on the side.

I wouldn't say "entire countries", but definitely a lot of people. I grew up (in canada) with 12 hour clocks.

Hell, I live in the UK and I can't stand 24-hour time ("military time"). I have to convert in my head every single time by subtracting 12 to figure out what the time actually is.

I hate 24 hour time so much. I won't use software that can't be configured to AM/PM.

I don't intuitively know 17:00 is afternoon. I have to think about it. You get used to what you grow up with.

If you are an American, joining the military is a quick lesson. Dates are d/m/y and time is 0000hrs. Distance for anything that is not your PT run is in km. I had lived overseas, so it was not a big deal for me, but others in my unit struggled for a bit. I have enforced both F/C, mi/km, and 12/24 on my family from the start. Both my wife and kid were pretty happy I did when we started travelling outside the US quite a bit. No issues on whether they needed a jacket, how far away something was, or what the transport schedule said.

It also may or may not be evening [1]

1: https://en.wikipedia.org/wiki/Evening

There are huge swaths of Americans who will literally not know what 17:00 translates to in AM/PM. And even for the Americans that do know, virtually all of them (unless they are in the military or another profession that commonly uses 24h time), will have to "translate" by subtracting 12 in their head for PM times.

24h time is extremely rare in everyday use in the US.

The demographic you're referring to - North Americans within the USA who don't know how 24-hour time works - are < 2% of the world's population.

> 24h time is extremely rare in everyday use in the US.


> There are huge swaths of Americans who will literally not know what 17:00 translates to in AM/PM.

1. Same is true for most people in the world

2. Wait, you mean they don't know if 17:00 is early morning or late afternoon??

Seriously, I have a hard time to believe that. I'll believe that most Americans need to convert to am/pm, but not that "huge swaths" literally cannot understand this.

My expectations of the average USA'ian are better than that.

>subtracting 12 in their head for PM times.

Is that how you're supposed to do it? All my life I've been subtracting 2 and ignoring the first digit for PM times (with the exception of post-20:00 times). I just noticed that when I was a kid and have been using it ever since.

e.g. 17:00 - 2:00 = 15:00 => 5:00
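The two shortcuts ("subtract 12" vs "subtract 2 and keep only the last digit") can be checked against each other mechanically; they agree through 21:00 and only diverge at 22:00 and 23:00, where the 12-hour answer has two digits. A quick sketch:

```python
# "Subtract 12" vs "subtract 2, keep only the last digit" (i.e. mod 10)
for h in range(13, 22):
    assert h - 12 == (h - 2) % 10  # both give the 12-hour PM hour

# The digit-dropping trick breaks late in the evening:
print(22 - 12, (22 - 2) % 10)  # 10 0  -> 10 PM, but the trick says "0"
```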

It's ridiculous TBH, whether its travel apps, or setting timezones in things.

For a lot of products, US sales are just worth so much more than sales from anywhere else, that it is reasonably common to not care much about the rest of the world. I’ve worked for a few companies that started out with a heavy focus on localization, but eventually realized it was just a massive waste of money.

The issue isn't doing business in just America. The issue is the communication. Some services will have big banners like "Works everywhere flawlessly" and then (if you are lucky) in some dark corner of the FAQ there will be a tiny sentence, "Only available in US for now" (translation: it will never be available anywhere else).

Even global companies like Google, Amazon do this.

In situations like that non-US consumers aren’t their customers, and they’re simply not interested in communicating with them.

This isn't true for either Apple or Google or Facebook, all of which are determinedly global in their sales and marketing.

They’re also very weak examples of companies that do this. For starters, Apples localization is quite on point most of the time. Facebook and Google really don’t have many features at all that are territorially exclusive.

Then why are they on the internet? Aren't they aware that non-US people can also access it? And if they aren't interested then why do they add "for now" to their message?

And are you telling me global companies like Google and Amazon don't have any non-US customers? I don't know which world is that but I want to live in it.

Google and Amazon aren’t particularly good examples of this. They don’t have very many services that are only available to US customers.

But there are plenty of businesses that only trade with US customers, and make little effort to ensure that’s clear on their websites. To answer your question, the reason companies like that don’t put special effort into making this more clear is because while some people might think this is a problem, it’s certainly not their problem. If somebody browses their website for 20 minutes, decided to make a purchase, and realizes they can’t, the business hasn’t lost anything. That person was never going to be their customer to begin with, and it’s not exactly easy to put together a business case for improving UX for people who will never be customers.

Im not really making any value judgements about doing this, I’m just giving you the perfectly rational explanation for how this occurs.

Chicken and egg problem, maybe the US sales being stronger are a reflection of the system's lack of ability to deal with international customers (even if only in English)

Though yes, cross-border commerce is annoying.

America is also incredibly wealthy in terms of mean disposable income (iirc about 45% higher than the UK, for example), with a substantially larger population.

I mean, the problem being described has nothing to do with US sales though. I'm sure if people ask for the time in London, 99% of the time they mean London, UK, irrespective of whether they are US-based.

There is a known list of "Global Cities" which should be treated exceptionally: https://en.wikipedia.org/wiki/Global_city

The world is on a first name basis with these cities, and AI should know to treat them exceptionally.

There's an old episode of All in the Family where Archie Bunker loses his Christmas bonus for shipping something to the wrong London, in 1971, way before Siri or computers.

This ain't a new problem.


My favorite Siri moment since like iOS 7 is when I ask Siri to turn off all alarms:

".......... one moment please ........... there's a problem with the network connection".

No. There. Is. Not. In fact, all alarms always get deactivated. All other queries immediately before or after work fine.

The failure rate when I ask that is no less than 100% for me. Just sad.

Shouldn’t they simply default it to the most populous city by that name, or some other criterion for “better-known”?

I'm not surprised that Siri gets things like this wrong, unlike every traditional search engine. Each text-based search engine returns more than one possible response, and if they return the wrong answer first, they can monitor how often people click on other answers, and use that to continuously train their engine. Apple does not have that source of training data.

However, the fact that so many people here defend the result due to "context" is a bit interesting. I would think that from both a technical and business point of view it should be clear that there is only one correct result for the query given. I would be surprised if more than 5% of people making the query worldwide (or in the US, for that matter) would be interested in anything but the time in London, UK.

Is there some kind of Stockholm syndrome involved?

> I'm not surprised that Siri gets things like this wrong, unlike every traditional search engine. Each text-based search engine returns more than one possible response, and if they return the wrong answer first, they can monitor how often people click on other answers, and use that to continuously train their engine. Apple does not have that source of training data.

Amazon doesn't have that either. Google Assistant is the best when it comes to answering questions

I've returned to manually calling people from the Contacts app because of how frustrated I'd get every time I'd try to make a phone call through Siri. The whole exchange takes about 20 seconds:

"Hey Siri, call John Smith"

Siri: "Did you mean John Smith?"

"Yes John Smith"

Siri: "Calling John Smith"...

I had a boss, Joe Chan. My wife is Jen. A nonzero portion of the time, I'd say something like "Hey Siri, text Jen I love you" and would get back "OK, texting Joe Chan I love you".

I've texted Joe approximately never. I've texted my wife 8 times this morning. If there was any ambiguity at all, could you, uh, optimize for the contact I actually contact?

It seems like this would always be a hard task, especially once you get down to the smaller cities.

Shortly after Siri launched, I would be sitting in the Bay Area asking Siri “What’s the weather like in Pasadena?”. Siri would return the weather in Pasadena, TX, not Pasadena, CA.

The reason I could come up with why Siri returned the Houston suburb instead of the LA suburb? Pasadena, TX has a higher population (149K vs 141K), so it returned the one that had the higher population, even though the California one was much more prominent.

Seems like Gruber’s experience led to an overcompensation. It’s similar to a bug where I was actually in Pasadena, CA and asking for weather in Santa Barbara, and Siri would return Santa Barbara Island, not Santa Barbara, CA, even though the island was closer to me as the crow flies.

Siri is still bad (and perhaps indisputably the worst assistant), but the competition in similar areas where text is typed doesn't seem to be any better. When I'm on a browser and go to Bing or Google, they try to guess my location from my IP address and show news from around that location. Move to maps, and suddenly it's like using a product from a totally different company that wants to avoid using the IP address for geolocation (maybe because someone didn't like it for some other purpose). Start typing a street name and the autocomplete list will show a bunch of places in the US (it's almost always the US) until I finish typing the city name (and sometimes the state or country too). Goes to show how poorly these services are designed.

> Why in the world would you get a completely different answer to a very simple question based solely on which device answers your question?

Because the phone has location services enabled and the pod does not? Just a guess but they are certainly operating under different contexts.

> Bing got it right, with bonus of analog clock.

I tried Bing and wow, the search UX is pretty good; it's practically a pixel-for-pixel copy of Google nowadays: sign-in button in the top-right corner, an image search icon on the right of the search bar, images of London on the right side of the page, etc. But I just haven't been using Bing for years. Am I going to use Bing? Probably not, unless Google becomes unstable, which I don't foresee. For a commodity (free) internet service like a search engine, fostering user habit is the key to adoption: if you occupy a piece of users' minds, they will stick around. Typing a different URL is just counterproductive, since it has to fight muscle memory.

> If you had a human assistant and asked them “What’s the time in London?” and they honestly thought the best way to answer that question was to give you the time for the nearest London, which happened to be in Ontario or Kentucky, you’d fire that assistant.

If I had a factory in London, Kentucky and I worked in Seattle, and I never did business in the UK, then I very well might mean that one. But no, I probably would never mean just the closest London.

This is a hard, hard problem. Names can denote multiple entities, and which one is meant is context-dependent. People get this wrong all the time; I see no reason why machines would be better, and plenty of reason they will be worse in the near term.

Okay, I knew I was stupid already, but if I was an assistant, and smart enough to know there was more than one London (which I've until recently not been), I'd have asked "Which one?" in case there was not enough context to make a guess. There are probably _A_LOT_ of people who have a lot more to do with the non-british London who'd be very sad indeed if The Machines (tm) suddenly decided that EVERY request about London MUST ALWAYS be about the British one..

Kinda like how I'm eternally sad when I get german results, just because my connection is from Germany, but I am Danish, sitting in Denmark and am usually NOT looking for the nearest biergarten (lies, I am, but we have none).

There are probably _A_LOT_ of people who have a lot more to do with the non-british London who'd be very sad indeed if The Machines (tm) suddenly decided that EVERY request about London MUST ALWAYS be about the British one..

There will be a lot more people who are asking about the British London than people who are asking about their nearest London. Apple will be making far more people sad by defaulting to the local one. If Siri asked "Which London?" every time most people would get annoyed.

What should happen from a UX perspective is that Siri should respond with "The time in London, UK, is..." which would inform the user that Siri is differentiating between different Londons, and that they need to specify a locality in order to narrow down the query to some other London if necessary. 99% of the time the user will get what they wanted first time, and the other 1% of the time the user will be informed that their query wasn't accurate enough.
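The policy described here (default to the prominent match, but always echo the disambiguated name back so the user can correct it) is easy to sketch. Everything below is an illustrative assumption, not how Siri actually works: the candidate table, the populations, and the use of population as a crude prominence proxy are all mine.

```python
# Hypothetical candidate table; names, time zones, and populations are
# illustrative round figures, not real geodata.
CANDIDATES = {
    "london": [
        ("London, UK", "Europe/London", 9_000_000),
        ("London, ON, Canada", "America/Toronto", 400_000),
        ("London, KY, USA", "America/New_York", 8_000),
    ],
}

def resolve(city: str) -> str:
    """Pick the most populous match and return its full disambiguated
    name, so the spoken reply can echo which one was assumed."""
    name, _tz, _pop = max(CANDIDATES[city.lower()], key=lambda c: c[2])
    return name

print(resolve("London"))  # London, UK
```

A real assistant would also remember a correction ("no, London, Ontario") as a per-user override on top of this default.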

> What should happen from a UX perspective is that Siri should respond with "The time in London, UK, is..."

This is exactly what Google assistant does. You can follow your question up with "what about London, Canada" and it will say "The time in London, ON, Canada is...".

There are two things a good assistant will do: 1. save you time by making reasonable assumptions and 2. inform you which assumptions it has made.

Selecting a default behaviour based on what "most people" will benefit from is great in some cases, but mostly in cases people will encounter only once.

People who happen to fall beside the "most" category should not have to deal with that every single time.

The right thing to do for Siri is to ask, maybe just once, which London you want to know the time for, if it's not smart enough to infer it from something else it's learned about you.

Assuming a person is physically closer to the non-british london, it's entirely likely that they travel there more often, and if the siri software has no other parameters tracked, then it's not unreasonable to assume that they're asking about that.

Assuming a person is physically closer to the non-british london, it's entirely likely that they travel there more often...

It's also reasonable to assume they're in the same timezone as 'their' London and wouldn't need to ask what time it is in the local one. That's another reason to default to one further away, which for most people would be Britain.

> Assuming a person is physically closer to the non-british london, it's entirely likely that they travel there more often

It really isn’t. I can see this being the case for people that live within, say, 100 miles of London, ON - but why would they want to know the time there in that case?

I would guess that there are more people in North America who are interested in London than people in North America who are interested in London, Ontario.

See what I did there? London (which is a major global city) doesn't need a qualifier anymore than New York does.

You're right that some people want to know the time in London, Ontario. But most people who ask the time in London don't.

In the absence of context, the right thing is to give the time in the UK. If more context is given: "What time is it in London, Ontario," then obviously it should give the time in Ontario, and perhaps remember that this person is an exception to the rule.

That would get annoying very soon. If you look hard enough, almost any question is ambiguous.

I think the best thing to do is to pick one interpretation, answer that, but make it clear which London you picked (“the time in London, England is 12:34”), and then, handle “no, the one in…” correctly.

Falling back to "which one do you mean?" should be the exception, as a voice assistant that can't make that initial guess right most of the time isn't worth using.

> I think the best thing to do is to pick one interpretation, answer that, but make it clear which London you picked

That’s actually the exact interaction you get with Siri today (I just checked):

U: What time is it in London?

S: It’s 7:09 AM in London, Canada

U: No, London UK

S: It’s 12:09 PM in London, England

The only problem is she doesn’t really learn this for subsequent requests, although that is a different problem IMO.

> The only problem is she doesn’t really learn this for subsequent requests, although that is a different problem IMO.

That's kind of the 'how do you take your tea?' problem, which I think really annoys users.

(context: the first or second time you make tea for someone, you ask milk/sugar etc. Ask it after that and they're going to think you're pretty rude)

Asking for clarification is a GREAT response. I'd be happy if a human assistant OR Siri were to respond that way.

It sounds like a good idea until you realize how many interpretations any given question can have, and how many of them are extremely unlikely.

"How far is Jupiter from the sun?" "Jupiter, Florida; or Jupiter the planet?"

"How old is Donald Trump?" "There are 21 people with that name, do you mean ...?"

"Call 911" "The emergency number or the English 90s boy band?"

TBH I think there are only two cases where you would answer with London, Ontario time:

- you are from London, Ontario and are traveling in another TZ

- you are in London, UK

In all the other cases, with no further context, London, UK should be used, because it's one of the "capitals of the world".

Your #2 is not good. I live in London, UK and when asking "What's the time in London", I mean London, UK.

You may ask why do I need to specify London at all? Here's one of the possible scenarios, sounding very normal as human communication goes:

(I'm in London wanting to phone my relatives in Moscow, which is several time zones away. I don't know the current time in either Moscow or London, and I don't know what's the time difference between the two as not all countries observe DST)

Me: "Hey Siri, what's the time in Moscow?"

Siri: "It's 10:35PM in Moscow, Russia"

Me (thinking about whether I'll finish speaking in time for dinner): "Hey Siri, what's the time in London?"

Siri: "It's 8:35PM in London, UK"

Asking "What's the time?" instead of "What's the time in London?" sounds very unnatural here.

Fair enough, although I do see it as a not so common case. But maybe it is more common than people wanting to know the time in London, Canada while being in London, UK.

It's like the "Shirt Without Stripes" problem that was posted here: https://news.ycombinator.com/item?id=22925087

I'll throw in a broader question, derived from the apparent decline in praise toward Apple by famously apologetic writers/podcasters. What the hell do people buy new tech (and more specifically iPhones and iPads) for, exactly? Where can the value proposition actually be made, unless you're an illustrator or a very frequent camera user? I have an older iPad, a 2019 MBP, and a OnePlus 3 phone. They're fine, but not really any better than the crap I previously had, and I struggle to see how newer stuff would markedly improve anything, save for apps working again that have stopped supporting my devices.

I must admit I have grown weary of appending “the country” to every Google search I make related to Georgia.

Also, for a long time, searching Google for “London bridge” returned an incorrectly labelled photo of Tower Bridge as its top result.


To be fair, most people think Tower Bridge is London Bridge, not just a London bridge.

Well yes, but surely Google showing the wrong information is not helping. I can understand if the argument is that Google is trying to show what the user intended, and they had correctly surmised that the user is wrong. They do a good job of this with misspellings. Perhaps they ought to do something similar in this case too?

[user Googles ‘London bridge’]

Google: Showing results for Tower Bridge (click here to instead only show results for London Bridge).

DuckDuckGo is usually guilty of this as well, typically picking a less relevant city in the Americas rather than the more relevant city in Europe (when doing a generic query of "something" in "city").

Modern ai has no “common sense”. Worse yet, no one knows what it is or how to add it. It’s why self driving cars can’t really drive, search results produce nonsense, speech recognition barely works, robots can’t do much.

He says duckduckgo got it right, but that not true, he got lucky.

When I ask DDG for "the weather in Nottingham", it often answers with the weather of a tiny town in the US, instead of the major city in the UK.

Duckduckgo is also dumb.

DuckDuckGo just delegates to other search engines.

Just a guess but when the author checks Siri on his phone it has access to GPS so is probably giving him the nearest 'London'. When he checks on HomePod it has no GPS so gives him the most common 'London'.

Although it's infuriating and we want these tools to be much better I'm not sure how Siri can really ever know which London he wants. A human assistant without context is going to need to make a guess too and isn't going to be right with that guess 100% of the time. It would make much more sense to request "London, England".

Although Google Maps has now fixed this, it was quite a big problem for the longest time. The place is Budapest, Hungary. Place names are unique within districts but not within the entire city. For the longest time, entering Deák tér (tér meaning square) into Google Maps brought up a tiny speck of an insignificant location in the boondocks instead of the very centre of the city, where all three underground lines cross and the airport bus terminates. The thing is, the latter is officially called Deák Ferenc tér... but no one alive calls it that.

You have to know the time in London Canada when you're king, you know?


Apple fan, but I had to disable siri after the false activation count surpassed the useful information count.

They should figure out how to hire ‘alexa’ as the back-end like they hire ‘google’ for search, etc.

The problem with these devices is they don't ask for more context, like a human might. Sometimes they use telemetry for context, but most don't seem to keep any memory of previous questions asked.

If Siri asked: did you mean London, Canada or London, England, and then perhaps weighted that response in the future, that would probably end up creating a feedback loop where it would get quite good.

In the case of Siri, Apple has said they are very committed to privacy, and this might preclude this type of solution.

Siri needs to listen to a user’s response after presenting answers. The user hive mind would clearly flag problematic replies worthy of further examination.

Two scientists walk into a bar.

The bartender asks what they would like to drink.

The first scientist says "I'll have a glass of H2O please." The second scientist says "I'll have H2O too." The bartender gives them both water because she is able to distinguish the boundary tones that dictate the grammatical function of homonyms in coda position, as well as pragmatic context.

Siri, be like that bartender.

https://youtu.be/3Qi3tT2Xulk This is a famous clip from a famous Polish comedy from 1981. The man is trying to send a telegram to London (Londyn in Polish). The lady says: "There is no such city as London. There is Lądek. Lądek-Zdrój (a popular spa in Poland)."

Last night I asked Amazon Alexa to clap for the NHS, and the darn thing started singing a song about science instead. Well embarrassing, when I was expecting a loud clap to join in the chorus of neighbours, having done my wrist in. I don't even want to think what my neighbours thought if they heard it. Still, could have been worse.

They had the same problem with Paris when Apple released the version of OS X that introduced widgets (was it 10.4, aka Tiger?). In their demo there were screenshots with the weather widget set to Paris, showing it would be sunny all week long. That's what tickled me. And indeed, "Paris" was equivalent to "Paris, Texas" for the weather widget.

This page says there are 29 places around the world named London:


That suggests to me if the answer was not the obvious one and only London England, then the only sensible answer should have been a question asking which other London did you mean?

It’s true, but if you had to pick an arbitrary London, it seems like either the largest or the closest would make sense. It’s weird that it’s neither.

Yep, Siri is absolutely useless. I have no idea how people mention it in the same breath as Alexa or Google Assistant (although those are also incredibly brain-dead in their own ways). My favourite one with Siri was when I said "navigate to X, Birmingham" and it plotted a route (somehow??) to Birmingham, US (I live in the UK). Like, that's incredibly bad.

I tried Siri sometime in 2012 or so. Came to the conclusion that it was tuned for the US so I disabled it and never looked back :-p

A commenter here felt the need to clarify "Manchester" the other day https://news.ycombinator.com/item?id=23263165

So not sure if it's true that a human would automatically assume London, England - especially if you're in the US or Canada.

If Apple is going to be a services company, they need to make their services actually good.

Could you imagine if their hardware was as bad as Siri? We’d never accept it.

I wonder if the core Siri tech is just long in the tooth, but Apple is so pot committed to it, they can’t change it without a full rewrite.

Asking what “1000+5%” is in macOS Spotlight (Cmd+Space) gives a different result than using the Calculator or Siri. It doesn’t make it less confusing that Spotlight even shows a calculator icon when doing that!

Spotlight: 1,000.05

Siri: 1,050

Calculator: 1,050

I wonder now what else I screwed up by using the spotlight shortcut.
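The discrepancy is consistent with two different parses of the `%` sign; this is my guess at the parsing, not anything Apple documents:

```python
# Spotlight-style: "5%" is simply the number 0.05, added as-is
spotlight = 1000 + 5 / 100            # 1000.05

# Siri/Calculator-style: "5%" means "5% of the left-hand operand"
calculator = 1000 + 1000 * 5 / 100    # 1050.0

print(spotlight, calculator)  # 1000.05 1050.0
```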

Disambiguation is hard?

Once I asked Siri for the current calendar week and got the answer 'I don't understand your question'. I reworded my question several times, still without success. At best it wanted to make a new entry in the calendar app. In most possibly useful cases it simply fails.

I mean I get the complications but if I'm living _that_ close to London OH, chances are that I won't be asking the time because it'll probably be same as my own timezone. If I am living multiple timezones away, then I probably want to know about London, England.

The problem with Siri is that it's always tried to be too smart for its own good.

Remember back when asking it tried to force-feed everything through wolfram alpha? https://i.imgur.com/68pLeIQ.jpg

Siri is also absolutely unusable if you use more than one language on a daily basis.

Or if you have a lot of contacts with non-English names. Siri never gets this right, and you always have to reverse-engineer a 'fake' English pronunciation of said name.

I submit that we don't have the wisdom to tell an overfit solution from a genuine decent work in progress with rough edges. Worse yet, overfitting makes for nicer demos quicker, so genuine work is always at a disadvantage.

When I ask Siri to navigate to a contact, she won’t do it unless the contact has an address, even if the contact is tracked on Find My.

You’d think that after all those years Apple could have made the investment to program this feature, no?

Interestingly, if you reply with just “no, London England” it does remember you were asking about time and gives the correct answer. However, the next time you ask for time in London, you’re back to Canada.

During transcription, Siri frequently thinks that "exclamation point" -- the common punctuation -- is me trying to add the phrase "excavation point," a phrase nobody ever says.

Siri on the Apple Watch is the biggest piece of shit I've ever used. It chirps up for no apparent reason and answers an irrelevant question on a daily basis.

Apple make nice stuff. They suck at big data.

> At least when most computer systems are wrong they’re consistently wrong.

I disagree.

I hit this all the time with "Salt Lake City", it gives me the time in India. Despite the fact that my phone should know that I have tens of contacts in Utah, but few if any in India.

I've never even heard of a Salt Lake City in India before...

Wow, that is seriously messed up, because it's wrong even IN India, as they would of course refer to the city as Bidhannagar or just "Salt Lake", what with "City" not being part of the actual name.

Hmm, maybe that is how I'm triggering it? Asking what time it is in "Salt Lake"? Thanks, I'll play with that and see if it makes Siri behave better.

Is that with Siri or something else?

As much as the big players have been pushing it, I still have zero desire to speak out loud to any computer.

Probably going to get labelled as a mental disorder for feeling like this, in the coming decades.

I am more likely to be given directions to a thousand miles away than to the only restaurant in my hometown with the word “Whatever” in the name (that’s in my map favorites, too).

Another huge problem for these assistants is unusual band names. I have yet to successfully convince either Siri or Google Assistant to "play Einstürzende Neubauten".

Even worse when the band name isn't written with latin characters to begin with.

Siri is not an AGI (yet?). And when people say "Siri is stupid", they are talking about the engineers creating Siri. And I am sure Apple engineers are not stupid.

My iPhone gives me the right answer: London, England. Not sure why he got a different answer. And I’m in Texas, about 120 miles away from London, Texas.

Try Googling for "Liverpool". It's a major UK city with a metropolitan area of 2.2 million. Yet Google only gives you answers about the Liverpool football club.

Same for Manchester, made doubly difficult by City.

This worked fine on every iOS, iPad OS, macOS and watchOS device I have. I’ll try later on tvOS and report back if I can reproduce anything of what Mr. Fireball claimed.

I can also confirm iOS, MacOS, WatchOS, and my AppleTV picked London, England.

Side question. Do you use Siri/GA/Alexa/Alternatives for something serious? (which can cost you something if not done properly)

I recently asked Siri for directions. Between two matching locations, she picked the one 30 miles away instead of the one 2 miles away.

What time can I get shirt without stripes in London?

If I ask Siri on my iPhone "What time is it in London?" here in Germany, it replies with the time of London, England.

Because it assumes you want to plan V2 rocket attacks.

Because it's closer than London, Ontario.

Does anyone know of any company that is working on a conversational OS like as was seen in the movie Her (2013)?

This doesn't apply to Chicago for some reason. Siri got it right here. We must live far from any local London city.

It's likely the closest 'London' near the user. Side effect of living in former British colonies.

Siri gets it right for me, even though I am probably 6000 miles from London, England.

Maybe it's because I am not near enough to another London for the algorithm to make the assumption that I am talking about a "local London" and instead figures since they're all pretty far away, I probably mean the more well-known London.

Or maybe because I've traveled to London England but not any other Londons (That I can remember)

Or possibly because I have London, England in my World Time app.

Where would you draw the line though?

"What is the weather in Manhattan" probably means NYC, unless you are near Kansas...

But wildcats already know they need to suffix that one with a state qualifier.

Time for you to get a watch.

- I'll see myself out.

I live in Western Canada, and whenever I tell someone my sister lives in London, they guess I mean Ontario.

I mean the UK.

Meanwhile, Elon Musk claimed that Tesla self-driving technology will be worth more than $100,000.

What can we teach Siri? I've made Siri remember who my family members are.

There’s a joke that every place name followed by a pause and "Ontario" is made worse.

Siri on macOS freezes when I say “call dad”, while it works great on iOS.

Apple should just integrate Google Assistant, it's second to none.

The mistake Gruber is making here is that DDG, Google, Alexa, and Bing did not "get it right". Their answer wasn't wrong, but it's not because they "got it". They never "got it" any more than Siri didn't.

AI is a story we tell ourselves.

Of course they "got it". When any human says a context-less "London", except maybe people literally in London ON and surrounding areas, they obviously mean the London. It's a little human communication nuance but understanding "what I mean" vs bare "what I say" is a huge part of getting AI-ish assistants to be useful.

Google, DDG, Alexa, and Bing don't "get it", meaning they don't share a contextual framework that resembles our own. These algorithms mimic a shared contextual framework, but it is mimicry. That is the fundamental reason why they don't know how they are wrong when we decide they are.

Of course news travels fast. Someone on the Siri team fixed it.

Siri doesn’t know that my wife is also my spouse. Enough said.

you can set that up in the contacts app.

Gold content here for Siri's PM team if they are watching.

"Hey Siri, how many days are left in the year?"

Literally not reproducible. A lot of bluster over nothing.

I just tried it and my watch gave me the time in London, KY. I live in Atlanta.

For fun,

me: Hey Google, spell Elon Musk's son's name

List of cities that are SMALLER IN POPULATION than London, Ontario.

Tampa
Wichita
Cleveland
Anaheim
Honolulu
Saint Paul
St. Louis

I think the author doesn't grasp the depth of his own pettiness.

"Nilay Patel asked this of Siri on his Apple Watch." Am I the only one to be shocked by the situation? Isn't it the purpose of a watch?

We've reached peak Daring Fireball

Think we passed that a good while ago to be honest. Used to be a good read a few years back but just comes across as damage control at this point. "Siri isn't very good" is hardly a controversial opinion in 2020.

“Hey Siri, turn on all my alarms”

On that note, anybody know how often poor souls have booked flights to the wrong "Ontario, CA" because of this mess?

Would you trust Siri or any assistant to book a flight for you?

Sorry, I think my comment was confusing. By "this mess" I wasn't referring to the assistant screwing up, but rather the searches themselves bringing up an unexpected location with a similar name, even if you typed them yourself.

Google AI is AMAZING! Siri is Dumb!

Siri, What Time Is It in London, UK?

They are busy making TV shows.

"you’d fire them because that one wrong answer is emblematic of a serious cognitive deficiency that permeates everything they try to do."

Oh, if only, John. But then, who'd write for your blog?

(if you feel that's unwarranted: "Daring Fireball" was the outlet that wrote a character assassination piece on rms, backed by some irrefutable evidence, that turned out to be about esr, and nobody performed even the most casual of fact checking, and it's still up there with some sorry-not-sorry half-hearted retraction, and probably all because rms told Jobs they couldn't grant him an exception to turn gcc proprietary eons ago.)

For the interested, here is Gruber's explanation, correction, and apology.


I don't think "I sincerely and deeply regret the error." is fairly described as "sorry-not-sorry half-hearted retraction".

What are RMS and ESR?

Richard Stallman and Eric Raymond.

Apple really isn't on track regarding many important trends, even in areas they used to rock.

- Virtual Assistants: After almost nine years of development, Siri is still only really reliable at the most basic tasks, like setting timers or reminders.(1)

- Smart Speakers: The HomePod is way too expensive, and the whole package really is no match for Amazon's or Google's alternatives. Too inflexible, and the "Smart" part is laughable (see above). While Amazon's and Google's smart speakers are a common sight in many households today, Apple has been sitting on the sidelines for years.

- Laptops: The debacle around their butterfly keyboard design was only resolved recently … after almost four years of massive problems. The new 2020 MacBook Air has a thermal design that is weird, to say the least, and according to reviewers it has heat problems despite being cooled actively, when comparable laptops are cooled passively.(2) The 2020 MacBook Air and the MacBook Pro 13" have screen-to-body ratios that are outdated in comparison with the competition.(3) The MacBook Pro 16" has speaker issues, display issues(4) and may overheat when used with an external display.(5)

- Desktop Computers: The Mac mini's hardware is outdated and its price really isn't compelling. If you want to buy a proper external display from Apple to use with your MacBook or Mac desktop (mini/Pro) and are not a Hollywood studio video editor who needs a $5,000 XDR display, you are out of luck. If you want to do machine learning and the like on your new Mac and need Nvidia CUDA support, you are out of luck.(6) The whole "modern CPU with macOS" topic is a sad one.

- Web Browser: When surfing the web with Safari, you will encounter a rising number of websites that tell you to upgrade to a more modern browser like Chrome. Many web developers will tell you that "Safari is the new IE".(7)

- TV: Apple TV pricing is not competitive and the Siri remote is so bad that people buy 3rd party replacement remotes.(8)

1) https://www.inc.com/jason-aten/alexa-vs-google-assistant-vs-...

2) https://www.forbes.com/sites/brookecrothers/2020/04/18/does-...

3) https://www.ultrabookreview.com/21772-laptops-small-thin-bez...

4) https://www.laptopmag.com/news/uh-oh-apples-16-inch-macbook-...

5) https://talk.macpowerusers.com/t/16-macbook-pro-gets-excessi...

6) https://gizmodo.com/apple-and-nvidia-are-over-1840015246

7) https://www.safari-is-the-new-ie.com

8) https://www.theverge.com/circuitbreaker/2019/12/9/21002605/a...

It is amusing that someone thinking London means London, Canada should be fired for stupidity, but someone who cannot compute what is the time in another place by adding 7 (or 10 or whatever) hours to his current time and needs an elaborate device to do it is seen as smart and insightful. This is some pretentious BS.

> but someone who cannot compute what is the time in another place by adding 7 (or 10 or whatever)

The whole point of asking is precisely because I don't know how many hours I have to add. And neither do I know whether said country, region, or city is observing daylight saving time, etc.

> The whole point of asking is precisely because I dont know how many hours I have to add

You know, how about you learn it? Once. Instead of asking Siri or Google every time.

In other words, learn to fish.

There are a lot of cities in the world, though.

> who cannot compute what is the time in another place by adding 7 [...]

As someone who frequently talks to colleagues across timezones that are -9, -2, or +3.5 hours from me, where each of these have different dates for summer time, or don't have summer time, and meeting recurrences gets skewed etc, I don't see how asking about the current time off a location could be seen as "pretentious".
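A quick illustration of why the offset isn't a constant you can memorize, using Python's stdlib zoneinfo (3.9+): London's offset from UTC itself changes with the season, so "add 7 hours" is wrong for part of the year no matter which 7 you picked.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

london = ZoneInfo("Europe/London")

# The same wall-clock time in London has a different UTC offset
# in winter (GMT) versus summer (BST):
winter = datetime(2020, 1, 15, 12, 0, tzinfo=london)
summer = datetime(2020, 7, 15, 12, 0, tzinfo=london)

print(winter.utcoffset())  # 0:00:00  (GMT)
print(summer.utcoffset())  # 1:00:00  (BST)
```

And since different countries switch on different dates (or not at all), the offset between two non-UTC zones can take three or four distinct values over a year.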

Yes, I do this frequently too, and what I meant to be "pretentious" was not having this problem, but writing a post about it and discussing it on a news site. This is like writing a news story about the lint in my pocket.

My reply to the deleted comment by dkdbejwi383

> Haha yeah you're so much smarter than him. Somebody should give you an award!

I think his post is motivated by the kind of pretentious assholery that emanates from the world view the article reflects: that assistants are objects to be fired over whatever opinion their "owner" has on what is to be considered stupid or not.

"Oh! My assistant is so STUPID they didn't think about this thing the same way as I did, they should be fired, lowlife trash!"

Yeah, first in class entitled assholery right there.
