The backlash over some of the community's lack of enthusiasm for SwiftUI - mine included - was a lot milder than I expected, given the current trend of everything being either the best thing that ever happened or the worst in history.
While that definitely surprised me in a positive way, it also made me think about a broader topic: the over-abundance of hyperbolae (or hyperboles, if you really insist) in our field.
The need to generate excitement over what is fundamentally boring information-manipulation science drives me fairly bonkers, and in my opinion it has some very bad side effects.
Here are a few headlines from my newsfeed (pertaining to CS):
- "Nokia reveals 5G-ready lithium nanotube battery with 2.5X run time"
[at the end of the article]
As is commonly the case with new battery technologies, the researchers are providing no specific timetable for commercialization.
- "AI was everywhere in 2018 and it will continue to be a major topic in 2019 as we begin to witness AI breakthroughs across businesses and society"
No, we don't. "AI" doesn't exist, and machine learning algorithms are the same as they were 20 years ago.
- "Flutter will change everything, and Apple won't do anything about it"
Yeah, well... I guess predictions aren't that guy's forte.
- "Apple's AR glasses arriving in 2020, iPhone will do most of the work"
[just below the title, emphasis mine]
Apple's long-rumored augmented reality headset could arrive mid-way through 2020, prominent analyst Ming-Chi Kuo believes.
And that's just stuff people have thrown at me in the past few days... You can add quantum computing, superfine processor lithography, AR/VR news, etc., if you feel like it. I will spare you the most outrageous ones.
Look, I get it. Websites that are paid through advertising need to generate traffic, and they will use every clickbait trick they can find.
My problem is that people who are supposed to be professionals in my field are heavily influenced by those headlines and by, well, influencers who hype things. It's as if everyone, including the people who are supposed to actually implement those things for actual paying customers, is succumbing to a mass hysteria. No wonder clients are so harsh, and sometimes downright hostile, when it comes to evaluating the quality of the work done.
Because "AI" is so hyped up these days, I've had people refuse to pay me for ML work because the prediction accuracy was "too low", based on a fluff piece Google had posted somewhere about the theoretical capabilities of a future product that will do better... As if an on-device, computationally expensive model built by one guy could outperform a theoretical multi-million-dollar cloud-based computer farm that a couple hundred engineers will have worked on for a few years...
The five-star problem
By over-hyping everything, we end up in a situation where something that's good gets "four stars", something that's good with a very good (and expensive) marketing campaign and tech support mayyyyyyy get "four and a half", and everything else gets "one star".
Is a new piece of tech like SwiftUI good? No one knows, because we haven't used it in production yet. It's interesting, for sure. It seems fairly performant and well-written. Does it have limitations? Of course it does. Will it serve every single purpose? Of course not. Why can't we take a measured interest, recognizing both the positives and the drawbacks?
Is ML a useful tool? Yes indeed, to the point where we can already build useful stuff with it... Is it going to replace a smart human anytime soon? Of course not.
Lately, it seems like I have to remind people all the time that computer science is a science. Is chemistry cool? Of course it is! Do you see chemists running around clamoring that they have found a new molecule that will save humanity every goddamn week? No, because despite the advances in AI (without quotes, it means something fairly different from "machine learning"), we still haven't found a way to predict the future.