Apple is facing calls to beef up enforcement against fake reviews and rating scams after a developer took to social media to highlight the unfair practices he's forced to compete with because the tech giant has failed to root out fraudulent activity on the App Store.
Kosta Eleftheriou, one of the founders of the Fleksy keyboard app (who was acquihired by Pinterest in 2016), has — since March 2018 — been applying his expertise in autocorrect algorithms to make typing on the Apple Watch’s tiny screen not only possible but “simple, enjoyable and highly effective”, as Forbes’ reviewer put it.
His app, FlickType, has also been described by app reviewers as “astonishingly accurate”, a “fundamentally better keyboard” and “way faster” than the letter-by-letter scribble method Apple supports natively.
User reviews also include a large number of glowing five-star ratings. The overall rating from users currently stands at 3.5, dragged down by a number of lower scores. But if you take the time to dig in, the developer can be seen responding consistently and constructively to the issues raised by users who leave lower scores.
Sometimes complaints relate to Watch platform issues outside his control (as Apple limits how third-party text input can be accessed). Missing features are another common issue, and in many cases Eleftheriou replies to say he's added the setting the person was after (such as the ability to disable Auto-Correction) or to highlight a "brand new look & feel to make typing even easier". Other times he thanks users for raising bugs that he says have now been fixed.
Anyone reading how specifically each complaint is addressed would be confident the developer of FlickType is working hard to make sure the app meets customers' expectations, even though the overall rating means other Watch keyboard apps are 'rated' higher overall.
The problem for Eleftheriou is all his genuine hard work is being undercut by copycat app makers who are able to leverage weak App Store enforcement to profit unfairly and at his expense.
The scam goes like this: A bunch of Watch keyboard apps are published that purport to have the same slick features as FlickType but instead lock users into paying eye-wateringly high subscription fees for what is, at best, a pale imitation.
You might expect quality to float to the top of the App Store, but the trick is sustained by accompanying the clones with scores of fake reviews and ratings which crowd out any genuine crowdsourced assessment of what's being sold.
Fake reviews outnumber the real deal. It’s only if you take the time to read through the comments that alarm bells might start ringing…
“Wish I read the reviews before buying. I can’t even get it to work on my watch,” runs a one-star review of WatchKey, one of the rival apps Eleftheriou has complained about — which nonetheless has a higher overall rating than his app owing to also having a very large proportion of five-star reviews.
“We are so sorry for any inconvenienced caused. Please kindly email us to describe more about your scenario so that we can support you as soon as possible,” is WatchKey’s generic response to the one-star review.
“Terrible,” writes another one-star reviewer. “I bought this app to use T9 on my watch. I haven’t been able to get T9 to work on my watch, I’ve also reached out to the customer service email that’s listed on the app. But I haven’t gotten a response, I would advise to find a different app.”
WatchKey’s response to another abysmal verdict on its software? More platitudes: “Thank you for your feedback. Unfortunately, we haven’t received your email yet. Please kindly email us once more via email@example.com to describe more your scenario so we can support you as soon as possible.”
The pattern repeats across negative reviews. Even one of the ‘five’ star reviews warns: “You need to pay if you want to use the T9. They make you write a review to ‘unlock’ and then they ask for a payment.”
One component of the manipulation involves posting generic platitudes that do the bare minimum Apple requires to manage (genuine) negative reviews. The other is flooding listings with fake five-star reviews to keep the app's overall rating high. Step 3: Profit.
Eleftheriou's Twitter thread highlights some of what he says are "hundreds" of fake five-star reviews being used to drive Watch owners toward downloading the malicious clones, using wording that refers to non-existent features or to things you'd be doing on other types of devices (suggesting the text may have been cut and pasted from genuine reviews elsewhere).
A quick Google search for ‘buy ios reviews’ returns a staggering 643M results — including ads for companies touting “app reviews, installs and ratings [as] the best way to improve the rank of your apps at Appstore and Google Play” and selling “high quality iOS app reviews with ratings for $2.5… from 100% Real Users”.
Clearly selling fake reviews is a booming business — which in turn speaks to the woeful lack of effective enforcement.
In an extra kicker, Eleftheriou found that one of the scammy competitors had even ripped off his own app promo video, which demoed the features offered by FlickType, and used it in ads targeting app consumers on Facebook and Instagram.
Facebook does have policies against third-party infringement (under section 4 of its prohibited content policy) — but you might as well whistle for pro-active enforcement from the adtech giant. It only acts when it gets a complaint of infringement so preventing abuse of his marketing materials would require Eleftheriou to spend even more of his time hunting for and reporting the malicious ads ripping off his stuff. (“I did report and Facebook did eventually take it down. But… I knew this was not going to be any sort of lasting relief,” he confirms.)
Of course the really big kicker here is that Apple’s rules for developers clearly stipulate that submitting fraudulent reviews is a violation of the developer program licence agreement.
Its App Store review guidelines also warn that developers who attempt to cheat the system (such as by manipulating ratings) may not only have their apps removed from the App Store but could be expelled from Apple's developer program entirely.
So, to put it politely, it's not a good look for Apple that an indie developer with proven expertise and reputation is having to spend so much of his own resources fighting App Store scams because its own enforcement has failed to stamp them out, to the point where he feels the only path forward is a public call-out on social media to highlight systematic enforcement failures.
Eleftheriou tells TechCrunch he decided to raise the complaint on social media after what he describes as “simply depressing results” from engaging with Apple’s official ‘app dispute’ channel.
“They put you in contact with the other developer in question, and oversee the thread while they hope you will resolve the issue with the other party directly,” he explains. “The scammers I complained about in that dispute weren’t even the bigger scammers I mention in my Twitter thread. Yet, the complaint I had with them barely got addressed, and there was no response from Apple whatsoever on the issue of the fake ratings and reviews. Simply a ‘if we don’t hear back from you very soon we consider the matter resolved’. We even reached out to Apple privately after that but got no response.”
“What was most impressive to me, was that in the presence of the Apple legal team, the scammers did not feel threatened one bit — almost as if they know Apple is unlikely to do anything,” he adds. “In my view, Apple simply does not devote enough resources on this area.”
Since raising the issue on Twitter, Eleftheriou has reported a partial win — in that some of the apps he had complained about have been taken down from the App Store. (At the time of writing Apple has not made any public statement confirming any action.)
However the developer accounts do not appear to have been banned at this time. “It’s astounding that even pulling a scam like that, doesn’t get your developer account revoked!” Eleftheriou told us. “I mean if that didn’t do it, what would??”
We reached out to Apple about this issue and it provided some background information related to its developer policies — which forbid attempts to cheat the system (such as by trying to trick the review process, steal user data, copy another developer’s work, or manipulate ratings or App Store discovery), among other relevant provisions.
We also asked Apple if it’s considering any policy changes in light of the issues raised by Eleftheriou — and will update this post with any response.
“The main issue in my view is not the cloning here. I didn’t even care that they were using my name, or made their screenshots similar to mine etc. If only there was a system to better prevent fake ratings and reviews, none of this would matter,” Eleftheriou also told us. “People would be able to collectively protect themselves through their 1-star ‘votes’ but when that system is allowed to get rigged, everything else goes out the window.
“The promise of ratings and reviews you can trust does not exist any more which erodes consumer trust at an ever accelerating pace,” he adds. “I did a Google search to see what those ‘companies’ look like, if you want to buy ratings and reviews. These are proper, full blown companies, with support systems, and claims that their ratings won’t get deleted by Apple, unlike their competitors. It was shocking to see that this is an industry that is thriving.”
The issue of fake reviews certainly goes far beyond Apple's App Store, and it's a very insidious one.
Fake reviews are pretty much a universal experience across the Internet — whether you’re trying to buy stuff on Amazon, looking at places to visit on Tripadvisor or trying to find a local dentist with the help of reviews on Google Maps (in short: don’t) — given how many platforms now incorporate user reviews.
But the issue does look especially toxic for Apple.
A core part of the USP for its App Store is the claim that Apple's review process adds up to a higher quality, more trustworthy experience than alternative marketplaces that aren't so carefully overseen.
So a failure to do more to enforce against review scams and rating manipulations risks taking a lot more shine off Apple’s brand than Cupertino should be comfortable with.
Simply put: Consumers expect a higher standard from Apple. That’s why they’re willing to pay a premium for its products. Under-resourcing App Store review and enforcement thus looks like a false economy — not least because it risks driving quality developers like Eleftheriou away.
If a developer with so much pedigree can’t reliably sell his wares on the App Store what does that say about Apple’s ‘premium’ marketplace?
The issue is also likely to be increasingly on the radar of consumer watchdogs and regulators in the coming years. The European Union, for example, is planning to bake binding transparency and reporting requirements into incoming platform regulations — as it seeks to promote fairness and accountability in digital businesses.
Meanwhile, an EU Omnibus Directive that came into force at the start of last year (with a two-year deadline for Member States to transpose it) aims to beef up consumer rights through enhanced enforcement and transparency requirements, including directly addressing the issue of fake reviews by placing an obligation on traders to take 'reasonable and proportionate' steps to ensure reviews are genuine, among other measures.
In the EU, platforms will therefore start being required to 'justify' their enforcement failures vis-a-vis fake reviews. And if they can't, well, the regime includes tough 'GDPR-level' fines for breaches of consumer protection law. So the costs won't only be reputational, as they are currently.
The UK’s Competition and Markets Authority, meanwhile, has also been cracking down on the trade in fake reviews — specifically targeting Facebook, Instagram and eBay in recent years. Further attention to the issue from UK oversight bodies, which are now operating independently of the EU, also seems likely.