Hacker News | iainmerrick's comments

I think the secret of Ryanair is that their goal is actually to make their turnarounds as fast and efficient as possible, not explicitly to make money by adding a fee for every little aspect of the service.

If anything can possibly slow down flight boarding, disembarking or cleanup, they'll first try to remove it completely, and only if people object too much will they reluctantly offer it with a fee.

Pocket on the seat back -> most people don't use on short flights -> get rid of them.

Luggage -> most people need this, but not everyone -> charge a fee.

Reclining seat -> most people don't use on short flights -> get rid of them.

They do sell drinks and duty free; that's an interesting one. I guess once the flight is airborne, the flight attendants aren't really doing anything else (from management's perspective) so they might as well sell stuff. Plus the trolley blocking the aisle stops passengers from moving around, which they probably see as a big advantage.

I think this even applies to the ridiculous penalty fees they charge for e.g. trying to check in at the airport rather than doing it beforehand on the app. It feels like they're just trying to rip you off, but I suspect they see it more as a "nudge" to make people check in online, because that streamlines their airport process.

I got a little bit less annoyed by them when I realised this. Sure, it's still uncomfortable and sometimes infuriating, but it's all with the aim of an efficient and reliable service, and they're way better than average at that.


> It feels like they're just trying to rip you off, but I suspect they see it more as a "nudge" to make people check in online, because that streamlines their airport process.

I believe the airline pays the airport for every check in and luggage handling transaction. They are just cutting costs.


That's not (really) it.

Ryanair makes little to no money from passengers; nowadays it's mainly from selling airplanes. They were still profitable during COVID without even carrying passengers at some point, only thanks to their flying school, which, thanks to social dumping and the EU, allows them to charge €40k per wannabe pilot without even guaranteeing them a hire.

They ordered 2000 737 MAX aircraft, in their own special configuration, during the COVID and MCAS disaster, and paid dirt cheap for them.

Then they operate them marginally, and now that traffic has picked up again and the delay between ordering and receiving a MAX is about 8 years, they sell them back for a huge profit.

It's been known for ages in the industry.


Do you have a link for that? It sounds interesting but a bit unlikely. It's hard to see how charging for pilot training, even at 40K a pop, would be a sustainable business.

The thing about buying planes is also interesting, but sounds like a sneaky business move rather than the actual foundation of the business.

I've always heard that nobody really makes money from passengers, which is why airlines are always going bankrupt, and I'm sure Ryanair's margins are super skinny. But even so, it does seem like moving passengers around is the core of their business, rather than it just being a front for something else.


I never thought of it this way, but now it's clear.

I found that once I tack on luggage, a seat with more space, etc., they become more expensive than traditional airlines with the same package.

In other words, their business model really seems to be to cater to the "least hassle" passengers who travel light and don't need any extras.


Great analysis and insight! Thanks for sharing

Right. Anyone who thinks this is not a real threat is sadly deluded.

However, if you think through the scenarios, the US is in a very strong military position and there's not much Denmark or the EU as a whole would be able to do about it. They could threaten a direct military response, on the basis of making the annexation more trouble than it's worth, but then you're just playing chicken with a significantly crazier enemy.

Most likely the EU would try to calm the waters, and offer a compromise peace / surrender plan along the lines of the one the US has offered Russia in Ukraine.

The biggest obstacle ought to be political opposition and public protests within the US itself, but right now the US government is in a position to just move fast and make things happen, what with the weak Congress and compliant Supreme Court.

I hope and believe it'll become much less likely after the midterms, with a Democrat-led Congress motivated to push back against the executive's excesses.


It would start off with a complete boycott of anything US-made, and that would result in a lot of irreversible damage.

This is a path to madness and I'm really surprised that there are no saner heads in the US putting a stop to this before it gets even further out of control. I'm even more surprised at how many people in the US support this, either tacitly or even outright.


Most UX researchers today can back up their claims with empirical data.

HCI work in 1992 was very heavily based on user research, famously so at Apple. They definitely had the data.

I find myself questioning that today (like, have these horrible Tahoe icons really been tested properly?) although maybe unfairly, as I'm not an HCI expert. It does feel like there are more bad UIs around today, but that doesn't necessarily mean techniques have regressed. Computers just do a hell of a lot more stuff these days, so maybe it's just impossible to avoid additional complexity.

One thing that has definitely changed is the use of automated A/B testing -- is that the "empirical data" you're thinking of? I do wonder if that mostly provides short-term gains while gradually messing up the overall coherency of the UI.

Also, micro-optimizing via A/B testing can lead to frequent UI churn, which is something that I and many others find very annoying and confusing.
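(As a concrete illustration of the kind of "empirical data" an automated A/B test yields, here's a minimal sketch of a two-proportion z-test in Python. The click-through numbers are invented, and real experimentation platforms do considerably more, e.g. sequential testing and multiple-comparison corrections.)

```python
# Hypothetical A/B readout: does variant B's click-through rate beat
# variant A's by a statistically significant margin?
from math import sqrt, erf

def two_proportion_z(clicks_a, n_a, clicks_b, n_b):
    """Two-sided two-proportion z-test; returns (z, p_value)."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)       # pooled rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; p-value for the two-sided test.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

z, p = two_proportion_z(200, 10_000, 260, 10_000)
print(f"z = {z:.2f}, p = {p:.4f}")   # small p -> ship variant B
```

A sub-percent lift like this one comes out "significant" with enough traffic, which is exactly how micro-optimization can win each individual test while the overall UI drifts incoherent.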


There was no user testing as we know it today; it was mostly top-down application of principles.

This was all expert-driven at that time, to my knowledge.

Empirical validation did not really take off until the late 00s.

https://hci.stanford.edu/publications/bds/4p-guidelines.html

Don Norman had the explicit expert-knowledge-first stance in 2006 and 2011. Nothing inherently wrong with that, but it's definitely not research driven.

"Always be researching. Always be acting."

https://jnd.org/act-first-do-the-research-later/

Tognazzini and Norman already criticized Apple about this a decade ago. While they have many good points, I cannot shake the feeling that they were simply used to brand Apple as user friendly in the 90s, and that Apple never actually adopted their principles, just using them as they fit the company's marketing.

https://www.fastcompany.com/3053406/how-apple-is-giving-desi...

There are a bunch of discussions on this:

https://news.ycombinator.com/item?id=10559387 [2015]
https://news.ycombinator.com/item?id=19887519 [2019]


That's interesting, I hadn't heard that point of view before.

> Empirical validation did not really take off until the late 00s.

https://hci.stanford.edu/publications/bds/4p-guidelines.html

Hmmm, I don't quite see where that supports "Apple didn't do empirical validation"? Is it just that it doesn't mention empirical validation at all, instead focusing on designer-imposed UI consistency?

ISTR hearing a lot about how the Mac team did user research back in the 1980s, though I don't have a citation handy. Specific aspects like the one-button mouse and the menu bar at the top of the screen were derived by watching users try out different variations.

I take that to be "empirical validation", but maybe you have a different / stricter meaning in mind?

Admittedly the Apple designers tried to extract general principles from the user studies (like "UI elements should look and behave consistently across different contexts") and then imposed those as top-down design rules. But it's hard to see how you could realistically test those principles. What's the optimal level of consistency vs inconsistency across an entire OS? And is anyone actually testing that sort of thing today?

> I cannot shake the feeling that they were simply used to brand Apple as user friendly in the 90s, and that Apple never actually adopted their principles, just using them as they fit the company's marketing.

I personally think Apple did follow their own guidelines pretty closely in the 90s, but in the OS X era they've been gradually eroded. iOS 7 in particular was probably a big inflexion point -- I think that's when many formerly-crucial principles like borders around buttons were dropped.


The whole recoverability paradigm seems more like a feature from the developer's perspective, looking for a reason to exist, than a true user demand.

You already have state management for debugging purposes, so why not expose it to the user?

As an example, in Photoshop no non-professional users care about non-destructive workflows; these things have to be learned as a skill.

Undo is nice to have in most situations, but you can really only trust your own saves and version management for anything serious.

Something as simple as a clipboard history is still nowhere to be found as a built-in feature in macOS, yet it somehow made its way into Windows.
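(The "expose debugging state to the user" idea above can be sketched in a few lines. This is a purely illustrative toy, not any real app's design: each edit snapshots the document state, and "undo" is just popping a snapshot.)

```python
# Toy sketch: internal state history surfaced to the user as "undo".
class Editor:
    def __init__(self):
        self.text = ""
        self._history = []          # state snapshots, kept for undo

    def type(self, s):
        self._history.append(self.text)   # snapshot before mutating
        self.text += s

    def undo(self):
        if self._history:           # no-op when there is nothing to undo
            self.text = self._history.pop()

ed = Editor()
ed.type("hello ")
ed.type("world")
ed.undo()                           # drops the last edit
print(ed.text)                      # prints "hello "
```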


> You know he designed the horrible iOS 7 as well?

I don't think that's fully accurate, unless you have a link that confirms it? That Dye designed it, I mean, not that it was horrible...

Jony Ive was the head of design at that point (both hardware and UI). Wikipedia says Dye "contributed greatly to the design language of iOS 7" but Ive would have had final say. Certainly at the time as I recall it, iOS 7 was seen as Ive's baby.

Also, I'm not defending iOS 7, but I reckon its visual design was a lot more influential than it gets credit for. Think of those ubiquitous Prime bottles, with their bright saturated color gradients; the first place I remember seeing that style was iOS 7. I bet they picked that style for Prime because kids liked it, and kids liked it because kids like iPhones.

Edit to add: "bright saturated colors" goes back a long way to Fisher Price toys and the like, of course, but it's the gradients specifically that I think iOS 7 popularized.


I've heard rumors that part of why iOS 7 was so garish is because Dye's background was in product packaging so his team were doing design reviews on paper and didn't realize that the colors would look different on device due to CMYK vs RGB. Not sure if it's ever been confirmed but it would explain a lot.

I was in the room for a few design reviews for my part of iOS 7 (I was an engineer writing the new screens). Everything was done on a 90+ inch HDTV that we AirPlayed to from our Macs or iPhones for the room to view. Nothing was printed, though the design studio walls were covered in printed explorations of variations of concepts, that is true.

Dye was the senior rep of the Design org present and commenting on all our software progress. I never once encountered Ive.


Thanks for the clarification! Out of curiosity, do you have any other insight into how/why iOS 7 turned out the way it did? What was the internal attitude towards it like?

I find it hard to believe that Dye would be so incompetent to not even know about CMYK vs RGB. How did he even get hired by Apple?

You mean something like HDMI? If you’ve ever tried to plug one of those into the back of a TV, you’ll know it’s still pretty difficult to get it the right way up.


> If you’ve ever tried to plug one of those into the back of a TV, you’ll know it’s still pretty difficult to get it the right way up.

That's true, but the difficulty in that case comes from being unable to see the hole or fit into the space between the television and the wall.

For example, plugging an HDMI cable into the back of a monitor involves none of the difficulty of plugging an HDMI cable into the back of a TV, even though the connector and the port are the same in both cases.


At least HDMI is a 'low frequency' connector, often only ever plugged in once, as opposed to USB (or refueling a car)


I bought a cheap USB hub so I don't have to reach behind the TV to plug things in


Glad to see Vince Guaraldi prominently mentioned here. Like the author, I got into Guaraldi via the Peanuts music, then found I loved the rest of his stuff as well.

I think Guaraldi is almost like a jazz version of Erik Satie, who’s been discussed here a few times. His music seems very simple, almost simplistic, but his taste and feel are superb. It’s just really good and easy to listen to, which unfortunately means it gets dismissed as “easy listening”.


Using "easy listening" as a pejorative has always baffled me. Why does music need to be difficult?


Easy listening implies that there’s not much of anything there. Nothing surprising or unique about the song or the performance. No insightful message and nothing worth reflecting on after.

I don’t think the alternative is “difficult” for its own sake. Rather, those who would use the term as a pejorative are likely seeking new experiences and viewpoints in their music and get bored by same old diatonic melodies over plain inoffensive grooves. Novelty is a source of dopamine for some.

A lot of jazz music is difficult to the untrained ear, and I have distinct memories of hearing albums that I now feel are too conservative but in my youth thought they were too chaotic. I now understand that it was never difficult from the performer’s perspective - just high level musicians playing the music they hear. I wish everyone could hear jazz just once through the ears of a jazz musician.


I think that playing any kind of live music requires a bit of a two-way accommodation between the needs of the audience and of the musicians. I don't think it needs to be difficult per se, but there needs to be something in it for the musicians.

This might sound self centered, which is a frequent stereotype leveled against jazz musicians, but on the other hand, why bother? There are other things we could be doing with our time. And I don't think that playing "difficult" music is incompatible with delivering a high quality performance, which is always my mission.


I think it’s worth distinguishing “difficult to perform” and “difficult to listen to”. Something like hard rock or metal with lots of flashy solos can be technically impressive, but it’s not difficult to “get” -- when done properly it just gets you in the gut.

The accusation usually levelled at cutting-edge jazz (fairly or unfairly) is that it’s so niche that it is difficult to get; that it’s left behind any pretence at being popular music. Many listeners would even go further and sneer “they’re just playing notes at random!” or “you’re just pretending to like it!”

I do wonder whether good-sounding, easy-to-get music is purely a matter of fashion (being just different enough to be interesting, but conventional enough to be accessible), or if to some degree there’s another axis of skill/difficulty in great pop music, of making it catchy and universal.


I think that since at least from the time jazz began to mature, like maybe in the 1940s, there has been a back-and-forth between crowd-pleasing and dance-able music, and more exploratory and artistic music. The Stan Kenton Orchestra traveled with two separate "books," one for dance gigs and another for concerts. Ellington's material, of which there was a lot, is quite imaginative.

To me that's OK. When jazz ceased to be responsible for forming the backbone of popular music, it triggered a more experimental period, including some ventures that were pretty far out, such as free jazz and free improv. Jazz also experienced a shift in focus -- not uncontroversially -- by becoming an object of academic study.

I think we're in a period right now when bands are seeking more audience friendly material. Now, the big-band I play in is in some sense "enthusiast" music. We have a small but loyal audience of people who happen to like this kind of stuff.

But in another of my bands, two of the players are actively composing new material, and it's arguably listen-able by any standards. Maybe we're in a third era, where we're free from responsibility for making popular music, but also free from responsibility for establishing the stature of jazz as a "serious" art form, and can return to the business of pleasing ourselves and our audiences.


IME it's basically synonymous with "muzak" and "smooth jazz", the kind of bland and mediocre background atmosphere inflicted on mall shoppers (often substituted with the same handful of mindless holiday tunes this time of year).


If it's not painful it's not good. If you're enjoying it you're doing it wrong.


>Using "easy listening" as a pejorative has always baffled me. Why does music need to be difficult?

Yes, I agree with you, it shouldn't and doesn't need to be.

But some things like music, be it jazz or something else, aren't always just a matter of listening. They're a way of self-establishment, a way of living or pursuing life, a way people see themselves and communicate themselves to others. I'm not into this field or studying it, but it's a known behaviour model and you can find studies if you'd like to read more about it.

Right, some jazz aficionados tend to be like hipsters, who despise anything but the unorthodox things their own circle would grok. It's a way of self-establishment, a reason to keep themselves different, at least a bit better than others. I'm not claiming everybody is like that, but I've certainly met a few who are quick to classify someone by the things they like.

I find myself liking West Coast jazz bands and artists' performances more the older I get. And if I'm not completely wrong, it might be a more common trend; their share of radio airplay seems to have increased over the past ten or so years, at least where I live.


Signed Distance Field
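(For anyone unfamiliar with the acronym, here's the simplest possible sketch of the idea: a function that returns the signed distance from a point to a shape's boundary, negative inside, zero on the boundary, positive outside. A circle is the classic first example.)

```python
# Minimal signed-distance-field sketch for a circle of radius r
# centered at the origin.
from math import hypot

def circle_sdf(x, y, r):
    """Signed distance from (x, y) to the circle's boundary."""
    return hypot(x, y) - r

print(circle_sdf(3.0, 4.0, 5.0))   # on the boundary -> 0.0
print(circle_sdf(0.0, 0.0, 5.0))   # at the center   -> -5.0
print(circle_sdf(6.0, 8.0, 5.0))   # outside         -> 5.0
```

Font and vector renderers store a grid of these distances per glyph or shape, which lets the GPU reconstruct crisp edges at any scale.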


Have you tried Swift? It has the sort of pragmatic-but-safe-by-default approach you’re talking about.


Not enough to say yes in earnest. I help maintain some Swift at work, but I put my face in the code base quite rarely, and I've not authored anything significant in the language myself. What I have seen is some code where multiple different event/mutex/thread models were all jumbled up. I was glad to see that this was possible in a potentially clean way alongside the macOS/iOS runtime, but the code in question was also a confused mess, with a number of fairly serious concurrency issues (UB and data races) that had gone uncaught, and seemingly therefore not pointed out by the compiler or tools. I'd be curious to see a SOTA project with reasonable complexity.


And it is a joy to use, truly.


> Race conditions, silent bugs, etc. can occur as the result of the compiler mangling your code thanks to UB, but so can crashes and a myriad of other things. [...] That's it. That's all there is to UB.

You don’t think that’s pretty bad?


They can also occur from defined behavior. The point being that they're completely besides one another.


The spacecraft is moving away from the sun at escape velocity. How is it going to launch anything backwards and have it make it all the way back to earth?
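(A back-of-envelope number puts the question in perspective: solar escape velocity at Earth's orbital distance is about 42 km/s, so anything launched "backwards" from an escaping spacecraft would need a delta-v on that order just to stop falling outward, let alone target Earth.)

```python
# Solar escape velocity at 1 AU, from v_esc = sqrt(2 * GM / r).
from math import sqrt

GM_SUN = 1.32712440018e20   # Sun's gravitational parameter, m^3/s^2
AU = 1.495978707e11          # astronomical unit, m

v_escape = sqrt(2 * GM_SUN / AU)     # ~42.1 km/s
print(f"{v_escape / 1000:.1f} km/s")
```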

