Mobile apps are still horribly insecure. That's the sobering conclusion of researchers who tested a batch of Android apps and easily found 23 that leaked the personal data of 100 million users, and worse.
Aside from the obvious lessons for developers, there are also things here for IT people to think about. Mobile device management (MDM) may seem to have gone out of style, but it's as necessary as ever today, perhaps even more so with employees working remotely.
And it's not just an Android issue. In this week's Security Blogwatch, we go back to school.
Your humble blogwatcher curated these bloggy bits for your entertainment. Not to mention: overflowing volcanoes.
10,000-foot view: despair
What's the craic? Catalin Cimpanu sounds deeply frustrated: Android apps exposed the data of more than 100 million users:
Mobile app developers continue to expose their users' personal information through the careless misconfiguration … of third-party cloud services. [Researchers] found 23 Android apps exposing the personal data of more than 100 million users.
[For example] developers who forgot to password-protect their backend databases, [or] who left credentials for services such as cloud storage or push notifications inside their mobile app's source code. … Reports of mobile apps exposing user data by leaving backend infrastructure exposed online have been published [for] half a decade. [But] the problem has persisted, mainly due to poor coding practices.
Literally nothing [changes], despite repeated warnings.
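The "forgot to password-protect the backend database" failure mode is trivially checkable from the outside, which is exactly why researchers keep finding it. Here's a minimal, hedged sketch of such a check, assuming the common Firebase-style convention of appending `/.json` to a real-time database URL; the project name and classification wording are illustrative, not from the research:

```python
# Sketch: build the public REST URL a Firebase-style real-time database
# exposes, and interpret the status of an unauthenticated read attempt.
# The project name is hypothetical; no network call is made here.
from urllib.parse import urlunsplit


def firebase_rest_url(project: str, path: str = "/.json") -> str:
    """Build the REST URL for a database path (Firebase convention)."""
    return urlunsplit(("https", f"{project}.firebaseio.com", path, "", ""))


def classify(status: int) -> str:
    """Interpret the HTTP status of an unauthenticated GET."""
    if status == 200:
        return "EXPOSED: database readable without credentials"
    if status in (401, 403):
        return "locked down: security rules require authentication"
    return f"inconclusive (HTTP {status})"
```

An auditor would issue a plain GET against `firebase_rest_url(...)` and feed the status code to `classify()`; a 200 with data means anyone on the internet can read the database.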
And here's Tara Seals, with more: rampant cloud leaks:
The depth of data at risk in these apps is such that a range of attacks is possible, from using the credentials against other accounts, to social engineering, identity theft, or fraud. … Cloud misconfigurations that leave data publicly exposed happen all the time … and unfortunately end users can do very little to protect themselves.
Data was accessible via the real-time databases of 13 of the Android apps. … There were no authentication checks for accessing them. … In at least two of the apps, the cloud keys were exposed without any safeguards. … Push-notification managers for many of the apps were also not password-protected. … This could be put together in ingenious ways.
If you choose to use cloud storage as a developer, you must ensure that the key material needed to connect to that storage remains secure, and you must also take advantage of the cloud provider's access-control and encryption mechanisms to keep data protected. Mobile app developers should use the Android Keystore and KeyChain mechanisms, which are backed by the device's hardware security module, [and] use Android's encryption mechanisms when storing other sensitive customer data.
Who uncovered this latest wall of shame? Check Point's Aviran Hazum, Aviad Danin, Bogdan Melnykov, Dana Tsymberg, and Israel Wernik, tag-teaming: mobile app developers' misconfiguration of third-party services:
Services such as cloud-based storage, real-time databases, notification management, analytics, and more are just a click away from being integrated into applications. Yet developers often overlook the security aspect of these services: their configuration and, of course, their content. … Misconfiguration [puts] users' personal data and the developer's internal resources, such as access to update mechanisms and storage, at risk of compromise.
Misconfiguration of real-time databases is not new, but to our surprise the scope of the problem is still far too broad, affecting millions of users. … There was nothing in place to stop unauthorized access. … We were able to recover a lot of sensitive information, including email addresses, passwords, private chats, device location, user identifiers, and more. If a malicious actor gains access [to] this data, it could result in account takeover via credential stuffing (i.e., trying the same username-password combination on other services), fraud, and identity theft.
[For example] via "T'Leva," a taxi app with more than 50,000 installs, we were able to access chat messages between drivers and passengers, and retrieve users' full names, phone numbers, and destination and pick-up locations. [Another app,] "iFax," not only had its cloud-storage keys embedded in the app, but also stored all fax transmissions, [so] a malicious actor could access every document sent by its more than 500,000 users. … In addition, most of the apps [also had database] write permissions enabled.
Most push-notification services require a key … to identify [the] sender. What happens when those keys are simply embedded in the app's own files? … Imagine if a news outlet's app sent its users a fake breaking-news notification that directed them to a phishing page.
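Those embedded keys were found by looking inside the apps themselves, and anyone with the APK can do the same. A rough sketch of that kind of static sweep follows; the regex patterns and the sample input are illustrative (the AWS access-key-ID shape `AKIA…` is a documented format, while the generic pattern is just a heuristic):

```python
# Sketch: grep-style sweep for credential-shaped strings in the files of
# an unpacked app. Patterns are illustrative heuristics, not exhaustive.
import re

PATTERNS = {
    # Documented shape of an AWS access key ID.
    "aws_access_key_id": re.compile(rb"AKIA[0-9A-Z]{16}"),
    # Heuristic: something that looks like a hardcoded API key assignment.
    "generic_api_key": re.compile(
        rb"""api[_-]?key['"]?\s*[:=]\s*['"][A-Za-z0-9_\-]{16,}""",
        re.IGNORECASE,
    ),
}


def scan_blob(blob: bytes):
    """Return (label, matched_text) pairs for credential-shaped hits."""
    hits = []
    for label, pattern in PATTERNS.items():
        for m in pattern.finditer(blob):
            hits.append((label, m.group().decode(errors="replace")))
    return hits
```

In practice you'd run `scan_blob()` over every file extracted from the APK; a single hit means the "key embedded in the application's own file" scenario described above.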
Wow. Just wow. And showcecuck "politely" agrees:
I worked at a company that made mobile apps. … Bad coding practices … are universal.
I've seen supposed top developers leave default passwords on back-end databases, after [complaining] that they should be allowed to manage their own development servers. And then they expose those servers to the world with firewall rules, opening huge security holes.
These idiots work across all technology stacks.
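The plaintext passwords pulled out of those exposed databases are what make credential stuffing possible in the first place. A hedged sketch of the standard mitigation, storing only a salted, memory-hard hash so a leaked table can't be replayed against other services (Python stdlib `scrypt`; the cost parameters are illustrative, not a tuning recommendation):

```python
# Sketch: salted scrypt password hashing (Python stdlib).
# A leaked table of these digests is far less useful to an attacker
# than the plaintext passwords the researchers actually found.
import hashlib
import hmac
import os

# Illustrative cost parameters; real deployments should tune them.
# maxmem is set explicitly to cover the ~16 MiB this configuration needs.
PARAMS = dict(n=2**14, r=8, p=1, maxmem=2**25)


def hash_password(password, salt=None):
    """Return (salt, digest); a fresh random salt is drawn if none given."""
    salt = salt or os.urandom(16)
    digest = hashlib.scrypt(password.encode(), salt=salt, **PARAMS)
    return salt, digest


def verify_password(password, salt, digest):
    """Recompute and compare in constant time."""
    candidate = hashlib.scrypt(password.encode(), salt=salt, **PARAMS)
    return hmac.compare_digest(candidate, digest)
```

The random per-user salt means two users with the same password get different digests, and `compare_digest` avoids leaking information through timing.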
What other research can we find? David O'Brien adds his top three cloud failures: the most common cloud misconfigurations:
First place: … Azure Storage accounts / AWS S3 buckets not enforcing HTTPS. Accessing or copying data over unencrypted channels is definitely not recommended, and is a clear path to leaking data that should not be leaked. … Microsoft, for example, now sets this property by default; AWS does not enforce HTTPS by default.
Second place: … Azure App Service / AWS Lambda functions that should not be publicly accessible (many of which, we found, also contain clear-text secrets …).
Third place: … Azure Network Security Groups (NSGs) / AWS Security Groups (SGs) / GCP firewalls that allow access to management ports from the internet. … This means a cloud "firewall" is configured to allow inbound Remote Desktop Protocol (RDP) or SSH traffic from the internet. It's a very common attack path into cloud-hosted virtual machines.
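That third-place failure is mechanically detectable. A rough sketch of such a check, flagging rules that open management ports to the whole internet; the rule dicts are a simplified stand-in for any real provider's API shape, not an actual cloud SDK:

```python
# Sketch: flag firewall rules that expose management ports to 0.0.0.0/0.
# Rules are simplified dicts: {"cidr": "<network>", "port": <int>}.
import ipaddress

MANAGEMENT_PORTS = {22: "SSH", 3389: "RDP"}


def risky_rules(rules):
    """Yield (port, service) for management ports open to the internet."""
    for rule in rules:
        net = ipaddress.ip_network(rule["cidr"])
        # prefixlen 0 covers 0.0.0.0/0 (and ::/0 for IPv6 rules).
        if rule["port"] in MANAGEMENT_PORTS and net.prefixlen == 0:
            yield rule["port"], MANAGEMENT_PORTS[rule["port"]]
```

Restricting the CIDR (e.g. to a corporate VPN range) or moving management access behind a bastion makes the same rule disappear from the report.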
O RLY? I'm not sure whether QuinnyPig agrees or not:
"The number-one cloud misconfiguration is S3 buckets not enforcing HTTPS" is a take.
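Take or not, enforcing it is a single, well-documented bucket-policy statement keyed on AWS's `aws:SecureTransport` condition. A minimal sketch that generates it (the bucket name is a placeholder):

```python
# Sketch: generate an S3 bucket policy denying all non-HTTPS access,
# using the documented aws:SecureTransport condition key.
import json


def https_only_policy(bucket: str) -> dict:
    """Return a bucket policy that denies requests made over plain HTTP."""
    return {
        "Version": "2012-10-17",
        "Statement": [{
            "Sid": "DenyInsecureTransport",
            "Effect": "Deny",
            "Principal": "*",
            "Action": "s3:*",
            "Resource": [
                f"arn:aws:s3:::{bucket}",        # the bucket itself
                f"arn:aws:s3:::{bucket}/*",      # every object in it
            ],
            "Condition": {"Bool": {"aws:SecureTransport": "false"}},
        }],
    }


policy_json = json.dumps(https_only_policy("example-bucket"), indent=2)
```

Attach the resulting JSON as the bucket policy and any request arriving over unencrypted HTTP is refused, closing the leak path O'Brien describes.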
But surely it's a developer issue, not an Android issue? ArmoredDragon agrees:
Something tells me this is an iOS issue too, although the fact that iOS is notoriously difficult to audit makes it less likely a third party would ever detect something like this. Apple is already well known for letting scam apps slip right past its reviewers; something tells me misconfigured cloud storage, especially storage Apple has no control over, would sail past them as well.
Maybe so, but Android users have other problems to deal with. Dan Goodin explains: 4 vulnerabilities under attack give hackers full control of Android devices:
Unknown hackers have been exploiting four Android vulnerabilities that allow the execution of malicious code that can take complete control of devices, Google warned. … The four vulnerabilities were disclosed two weeks ago.
Two of the vulnerabilities are in Qualcomm's Snapdragon [SoC], which powers … a huge number of phones. … So far there have been four zero-day Android vulnerabilities disclosed this year, compared with one in all of 2020.
Google has released security updates to device manufacturers, who are responsible for distributing the patches. … Google representatives did not respond to emails asking how users can tell whether they've been targeted. … Without more actionable information from Google, it's impossible to offer useful advice to Android users.
What's the solution? Here's a blunt suggestion from couchslug:
Android is hopeless from a security perspective … and should be avoided by anyone who cares about security. It's important to understand what you can't have, in order to make informed decisions.
Pure-FOSS phones would cater to better-informed users, but there won't be many for at least a decade, because the problem is extremely difficult to solve. … Of course, the problem will never be taken seriously … because revenue is why companies exist, and security is an awkward cost center.
RoninX strongly agrees:
Google should provide Android users with a way to download and install security patches without waiting for their OEM. … I know Google has moved architecturally in that direction, but they need to move farther and faster.
If there's a vulnerability in all instances of certain versions of Android, it should be patchable across all those instances. Similarly, if the flaw relates to a common hardware component, such as a Qualcomm Snapdragon SoC, it should be possible to release a patch that runs on all such devices.
Meanwhile, a poetic gweihir simply rolls his eyes:
"App" rhymes with "crap." … The whole idea from the beginning was that semi-competent and incompetent people would write masses of apps. Some happen to be well written and hence actually get used. [This] is just a completely predictable side effect.
The moral of the story?
IT: BYOD MDM might not be in vogue, but it could be CYA.
Developer: Don’t, obvs. kthxbai.
Too embarrassed to ask? Marcus "MalwareTech" Hutchins to the rescue:
And finally
You've been reading Security Blogwatch by Richi Jennings. Richi curates the best blogs, finest forums, and weirdest websites … so you don't have to. Hate mail may be directed to @RiCHi or firstname.lastname@example.org. Ask your doctor before reading. Your mileage may vary. E&OE. 30.
This week’s Zomgsauce: Inbal Marilli (via Unsplash)