This article analyzes nearly a decade of Android CVEs, revealing that vulnerabilities often take 3–5 years to fully resolve. Native system components and kernel code account for most fixes, while human history, code complexity, and modification patterns are key predictors of vulnerability-inducing changes.

Study Shows Android Vulnerabilities Can Take Up to 5 Years to Fully Fix

13 min read

ABSTRACT

I. INTRODUCTION

II. BACKGROUND

III. DESIGN

  • DEFINITIONS
  • DESIGN GOALS
  • FRAMEWORK
  • EXTENSIONS

IV. MODELING

  • CLASSIFIERS
  • FEATURES

V. DATA COLLECTION

VI. CHARACTERIZATION

  • VULNERABILITY FIXING LATENCY
  • ANALYSIS OF VULNERABILITY FIXING CHANGES
  • ANALYSIS OF VULNERABILITY-INDUCING CHANGES

VII. RESULT

  • N-FOLD VALIDATION
  • EVALUATION USING ONLINE DEPLOYMENT MODE

VIII. DISCUSSION

  • IMPLICATIONS ON MULTI-PROJECTS
  • IMPLICATIONS ON ANDROID SECURITY WORKS
  • THREATS TO VALIDITY
  • ALTERNATIVE APPROACHES

IX. RELATED WORK

CONCLUSION AND REFERENCES


VI. CHARACTERIZATION

This section characterizes the collected vulnerability data. We note that Subsection VI.A uses all CVEs published in the Android Security Bulletins (ASB) from August 2015 to December 2023; Subsection VI.B uses the CVEs published in the ASB during the first year (August 2015 to July 2016); and Subsection VI.C focuses on the CVEs found in the framework/av project of AOSP.

A. VULNERABILITY FIXING LATENCY

Let us first analyze the time taken to detect and fix vulnerabilities in AOSP. Specifically, the number of days between each vulnerability-inducing release and its corresponding vulnerability-fixing release is measured. Figure 3 shows the results for each AOSP version (shown in the legend).

For a majority of the AOSP versions, the measured vulnerability fixing latency peaks between 1,000 and 1,300 days (i.e., 3–4 years). The exceptions are the recent releases (e.g., Android 13 and 14, which had been out for less than 2 years), where the observed latency is also under 2 years. The tail is also long. For example, some vulnerabilities introduced in two AOSP releases (e.g., v8.1) take over 4 years (>1,450 days) to be fixed.

While Figure 3 captures the time between vulnerability-inducing and vulnerability-fixing releases, it presents a conservative view. It excludes the time from the submission of a ViC to its corresponding AOSP release, which is about half a year on average. Similarly, it does not include the time from a fixed AOSP release to OEM device updates [41][42]. Consequently, the true latency from ViC submissions to VfC rollouts on user devices is longer (e.g., ~5 years instead of the 4 years in Figure 3). Additionally, since the security update support window for Pixel devices was extended by the Android OEM from 5 to 7 years in 2023, the true latency for older releases with the shorter support window could be even longer than the data in Figure 3 suggest.

We note that the vulnerability fixing latency distribution is accurate for each AOSP dessert release version. However, it does not directly show the vulnerability fixing latency distribution of Android OEM devices in the field, because OEM devices are usually upgraded to newer Android dessert releases thanks to the fast software update efforts (e.g., TREBLE [37]) since Android 8.1. To illustrate how to estimate the vulnerability fixing latency for OEM devices, consider an OEM device launched with Android 9.0, upgraded to Android 10 after one year, and upgraded to Android 11 after another year before reaching its End of Life (EoL). The vulnerability fixing latency for that device can be calculated by concatenating: (1) the first year of vulnerability fixing latency data for Android 9.0; (2) the first year of data for Android 10; and (3) the entire vulnerability fixing latency distribution for Android 11.
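To make the estimation concrete, below is a minimal Python sketch of the concatenation described above. The data structures, the one-year upgrade cadence, and the filtering by days-on-release are illustrative assumptions, not the paper's tooling.

```python
from typing import Dict, List, Optional, Tuple

def estimate_oem_latency(per_release_latency: Dict[str, List[int]],
                         upgrade_schedule: List[Tuple[str, Optional[int]]]) -> List[int]:
    """Concatenate per-release latency samples according to an OEM upgrade schedule.

    per_release_latency: vulnerability fixing latency samples (in days) observed
        for each AOSP release, e.g. {"9.0": [...], "10": [...], "11": [...]}.
    upgrade_schedule: (release, days_on_release) pairs; days_on_release is None
        for the final release the device stays on until EoL.
    """
    combined: List[int] = []
    for release, days_on_release in upgrade_schedule:
        samples = per_release_latency[release]
        if days_on_release is None:
            combined.extend(samples)          # keep the full distribution
        else:
            # keep only fixes that landed while the device ran this release
            combined.extend(s for s in samples if s <= days_on_release)
    return combined

# Example: launched on Android 9.0, upgraded after ~1 year, again after ~1 year,
# then stays on Android 11 until EoL. Latency values are illustrative placeholders.
schedule = [("9.0", 365), ("10", 365), ("11", None)]
per_release = {"9.0": [120, 400, 900], "10": [90, 700], "11": [60, 300, 1200]}
print(estimate_oem_latency(per_release, schedule))
```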

B. ANALYSIS OF VULNERABILITY FIXING CHANGES

Given the observation that AOSP vulnerabilities can take over 4 years to fix, this analysis uses vulnerabilities fixed and published in the AOSP security bulletins during the first year (from August 2015 to July 2016). Those vulnerabilities are mostly found in the Android 4.1–6.0 releases, namely, Jelly Bean, KitKat, Lollipop, and Marshmallow.

Vulnerability Fix Rate. Over the analyzed year, 356 CVEs are fixed, averaging approximately 0.975 (≈ 1) CVE fixes per day. However, relatively large variations in this rate are seen across the 12 months, as shown in Figure 4, which tracks how the vulnerability fix pattern changed over the one-year period. The fix pattern shifts noticeably, with a sharp increase in the number of fixes during the final four months of the analyzed release period. The surge aligns with the approaching yearly AOSP and Pixel device releases.

The seasonal pattern reflects the increasing focus on security and stress testing as the yearly release deadlines approach. Specifically, the emphasis during the initial months was on hardening the media and codec components of the Android native system. With the Android 7.0 (Nougat) release nearing, additional triggers were added to find upstream Linux kernel vulnerabilities. Such shifts in testing focus are common during a software release lifecycle: limited testing resources must be strategically allocated in accordance with development progress to meet quality, security, and other system integration requirements.

Vulnerability Severity Distribution. The severity data of the addressed CVEs reveals the importance of those fixes. About 82.9% (i.e., 32.9% critical and 50% high) of the fixed CVEs are categorized as critical or high. Here, critical or high means the fixes are promptly created and integrated into the main branch and all the backport branches for monthly releases, expediting the fix rollouts compared to the annual update cycle from the main branch used for moderate- or low-severity issues. Another 15.7% are classified as ‘moderate’, and only 1.4% are classified as ‘low’ or ‘none’.

Code Fixes for Vulnerabilities. There is a many-to-many relationship between the CVEs and their code fixes.

1-to-1 relationship. Typically, a single CVE issue is fixed by a single code change (e.g., a git commit).

1-to-M relationship. Some CVE fixes require multiple code changes. For analysis purposes, code changes addressing the same CVE issue are grouped together if the changes are in a single git project. This reflects the observed common practice of developers splitting large fixes into smaller, more manageable code changes. Additionally, a code change related to deploying a fixed kernel image (e.g., dropping a rebuilt image into an Android repository) is considered part of the initial code change in the kernel code repository, as the kernel image deployment stems from that initial source code change.

N-to-1 relationship. Conversely, a single code change can sometimes resolve multiple CVEs. This is seen in cases of redundant CVEs for the same vulnerability or when multiple CVEs share a common root cause. Another example of an N-to-1 relationship is when related CVEs exist for each affected device type (or chipset). Similar code changes applied to different device-specific branches are grouped together, including non-trivially cherry-picked changes with minor device- or chipset-specific adjustments. These semantically similar code changes are considered a single fix.

N-to-M relationship. While it is rare, fixing CVEs with a seemingly N-to-1 relationship can sometimes involve more than one code change. If distinct code changes remain across multiple system abstraction layers after the grouping practices described above, the layer containing the most significant fix is prioritized for analysis.
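As a rough illustration of these grouping rules, here is a hedged Python sketch. The record fields (CVE IDs, git project, change ID) and the example values are hypothetical; it only shows how 1-to-M and N-to-1 cases collapse into per-(CVE, project) groups.

```python
from collections import defaultdict
from dataclasses import dataclass
from typing import Tuple

@dataclass(frozen=True)
class FixChange:
    cve_ids: Tuple[str, ...]   # one change may resolve several CVEs (N-to-1)
    project: str               # git project the change was merged into
    change_id: str             # e.g., a Gerrit Change-Id or commit hash

def group_fixes(changes):
    """Group code changes into logical CVE fixes.

    Changes touching the same git project for the same CVE are merged into one
    fix (the 1-to-M case); a single change listed under several CVEs is counted
    once per CVE group (the N-to-1 case).
    """
    groups = defaultdict(set)
    for change in changes:
        for cve in change.cve_ids:
            groups[(cve, change.project)].add(change.change_id)
    return groups

# Example (placeholder IDs): two commits in framework/av fixing the same CVE
# collapse into a single logical fix for that (CVE, project) pair.
fixes = group_fixes([
    FixChange(("CVE-2016-0001",), "framework/av", "I111"),
    FixChange(("CVE-2016-0001",), "framework/av", "I222"),
])
print({key: len(change_ids) for key, change_ids in fixes.items()})
```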

Abstraction Layers of Code Changes. Figure 4 reveals the distribution of the first-year AOSP CVE fixes across the system abstraction layers (or software subsystem-component types). Initially and consistently, many CVEs are addressed in the Android system layer (such as the native servers, Hardware Abstraction Layer modules, and Native Development Kit libraries [37]). Notably, the final quarter saw a significant increase in CVE fixes in the kernel layer.

Among the CVE fixes, nearly half (46.7%) target the Android system. A significant portion (33.5%) addresses the Linux kernel, while firmware fixes (such as the bootloader) make up 4.2%. The remaining 19.8% is distributed as follows: Android app (~3.1%), Android Java framework (9.3%), other non-native code (5.1%), and configurations such as the SELinux policy, kernel config, init run command, and Android build rule (2.3%).

The Android native software components are about 5.8 times more likely to contain CVEs than the Android Java programs and configurations. Table II shows the system and kernel projects with the most CVE fixes. The higher security of the Java code stems from two factors: the app store inspection process for Android apps and the inherent security benefits of the type-safe Java and Kotlin programming languages used by Android apps and the Android framework. In contrast, the native software components are often developed by third-party contributors and other open source communities (e.g., GitHub and the Linux kernel).

However, vulnerabilities in the native code pose a significant security threat due to the powerful system privileges attackers can gain by exploiting them. Those low-level attacks can, in theory, subvert any overlying software running on top of the target layer and often do not require any user actions (e.g., app installation) to be triggered. For example, it is possible to remotely exploit a system-level vulnerability through an MMS (Multimedia Messaging Service) message, even if the message is never opened by device users. As a result, it is often difficult to detect such system-layer attacks.

Table II details the distribution of the system and kernel CVE fixes across their projects. The top five system projects in the table account for 67.9% of the first-year system-layer fixes. Notably, the framework/av project encompasses ~46% of the system-layer fixes, demonstrating the highest sample density. Within the kernel itself, drivers lead with 72.5% of the CVE fixes, followed by architecture-specific code (arch) at 8.8%, sound-related code at 6.9%, and file system (fs) code at 5.9%. The top 10 kernel projects encompass 51% of the first-year kernel CVE fixes.

C. ANALYSIS OF VULNERABILITY-INDUCING CHANGES

To characterize ViCs, let us focus on the CVEs fixed within the AOSP framework/av project, which exhibits the highest fixed-vulnerability density. The project is a valuable target for in-depth ViC analysis due to its extensive testing (including fuzzing), the security hardening efforts in the Android Nougat release (e.g., vulnerability fixes), and its large size (e.g., 3,513 non-hidden files and directories, comprising 254,899 lines of C/C++ source code, configs, documents, and build rules).

Fixing a single CVE issue can involve several VfCs. Of the 359 fixed CVEs analyzed, 77 require multiple VfCs that are merged into the target project. In total, those 77 CVEs are associated with 354 VfCs. Further analysis, using distinct code change identifiers, uncovers 244 unique VfCs. Our toolset then employs those unique VfCs to identify a total of 551 ViCs, which are subsequently characterized using our classification feature data types.
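The excerpt does not spell out how the toolset maps VfCs to ViCs, so the following is only a sketch of a common SZZ-style heuristic that could serve the same purpose: blame the lines a fixing commit removes or rewrites to find the commits that last introduced them. The helper name and repository paths are assumptions.

```python
import subprocess

def inducing_commits(repo, fix_commit):
    """Candidate vulnerability-inducing commits for one fixing commit.

    SZZ-style heuristic (an assumption, not necessarily the paper's toolset):
    for every line the fixing commit removes or rewrites, blame the parent
    revision to find the commit that last touched that line.
    """
    diff = subprocess.run(
        ["git", "-C", repo, "show", "--unified=0", "--format=", fix_commit],
        capture_output=True, text=True, check=True).stdout

    candidates, path = set(), None
    for line in diff.splitlines():
        if line.startswith("--- a/"):
            path = line[len("--- a/"):]
        elif line.startswith("--- "):
            path = None                        # newly added file: nothing to blame
        elif line.startswith("@@") and path:
            old = line.split()[1]              # hunk header, e.g. "-42,3"
            start, _, count = old[1:].partition(",")
            count = int(count or 1)
            if count == 0:
                continue                       # pure addition
            blame = subprocess.run(
                ["git", "-C", repo, "blame", "--porcelain",
                 "-L", f"{start},+{count}", f"{fix_commit}^", "--", path],
                capture_output=True, text=True, check=True).stdout
            for entry in blame.splitlines():
                if entry.startswith("\t"):
                    continue                   # file content, not metadata
                token = entry.split()[0] if entry else ""
                if len(token) == 40 and all(c in "0123456789abcdef" for c in token):
                    candidates.add(token)      # blamed (candidate ViC) commit
    return candidates
```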

Table III summarizes the initial evaluation results for each feature set using a decision tree classifier. For example, the third column shows how many LNCs the VP framework predicts as ViCs. Notably, the HH (Human History) and VH (Vulnerability History) feature sets achieve high accuracy in ViC identification. Conversely, neither the HP (Human Profile) nor the PP feature set detects any ViCs, while the remaining feature sets exhibit varying accuracy levels.
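As a hedged sketch of what such a per-feature-set evaluation might look like, the snippet below cross-validates a scikit-learn decision tree on one feature set and counts how many changes it flags correctly and incorrectly. The data, fold count, and feature columns are placeholders, not the paper's setup.

```python
import numpy as np
from sklearn.model_selection import cross_val_predict
from sklearn.tree import DecisionTreeClassifier

def evaluate_feature_set(X, y, n_folds=10):
    """Cross-validated ViC detection using a single feature set.

    X: (n_changes, n_features) values for one feature set (e.g., HH or VH).
    y: 1 for vulnerability-inducing changes (ViCs), 0 for likely-normal
       changes (LNCs).
    Returns (true positives, false positives) over the held-out folds.
    """
    clf = DecisionTreeClassifier(random_state=0)
    pred = cross_val_predict(clf, X, y, cv=n_folds)
    tp = int(np.sum((pred == 1) & (y == 1)))   # ViCs correctly flagged
    fp = int(np.sum((pred == 1) & (y == 0)))   # LNCs wrongly flagged as ViCs
    return tp, fp

# Illustrative usage with random placeholder data (two HH-like features).
rng = np.random.default_rng(0)
X_hh = rng.random((800, 2))
y = (rng.random(800) < 0.1).astype(int)
print(evaluate_feature_set(X_hh, y))
```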

Figure 5 visually analyzes feature values to provide deeper insights into the effectiveness of different feature sets. It shows the distribution of feature values for both ViCs (red symbols, upper row) and LNCs (blue symbols, lower row). The x-axis represents the value range of each specific feature data type. The visualization reveals patterns explaining why certain feature sets perform better than others in predicting ViCs.
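A minimal sketch of how such a two-row distribution plot could be reproduced with matplotlib is shown below; the feature name and the placeholder values are illustrative, not the paper's data.

```python
import matplotlib.pyplot as plt
import numpy as np

def plot_feature_distribution(vic_values, lnc_values, feature_name):
    """Strip-plot one feature: ViCs on the upper row, LNCs on the lower row."""
    fig, ax = plt.subplots(figsize=(6, 2))
    ax.scatter(vic_values, np.ones(len(vic_values)),
               color="red", marker="^", alpha=0.4, label="ViC")
    ax.scatter(lnc_values, np.zeros(len(lnc_values)),
               color="blue", marker="v", alpha=0.4, label="LNC")
    ax.set_yticks([0, 1])
    ax.set_yticklabels(["LNC", "ViC"])
    ax.set_xlabel(feature_name)
    ax.legend(loc="upper right")
    fig.tight_layout()
    return fig

# Illustrative placeholder values for a single feature (e.g., CCrevision).
rng = np.random.default_rng(1)
fig = plot_feature_distribution(rng.poisson(2, 100), rng.poisson(4, 400),
                                "CCrevision")
fig.savefig("ccrevision_distribution.png")
```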

The HP feature set shows limited effectiveness in AOSP because it relies on two discrete features. ViCs tend to cluster within a narrower range of those feature values compared to LNCs (e.g., ViCs utilize only one value of the HPauthor feature). The limited value distribution likely stems from the target project development being primarily handled by a single organization, fostering consistent coding practices within the AOSP framework codebase. Consequently, code change author affiliation is not a strong predictor of vulnerabilities within AOSP.

The initial hypothesis that malicious external contributors were a primary source of vulnerabilities proves incorrect in Android platform development. The data analysis reveals that most ViC authors are not malicious third-party actors. This is likely due to the rigorous collaboration process in place for external contributions to AOSP: such contributors usually lack direct commit permissions, and their code changes can undergo extensive scrutiny by the project owners. Thus, the observation is that AOSP vulnerabilities are more likely to arise when both authors and reviewers are trusted entities and, consequently, inspection and testing are less thorough.

The CC (Change Complexity) feature set reveals a pattern. Most ViCs involve small- or medium-sized code changes, while LNCs exhibit a wider range of sizes, encompassing both tiny and extra-large code changes. This suggests that code modifications exceeding a certain size threshold (e.g., >250 lines) introduce enough complexity to distract both authors and reviewers, increasing the likelihood of undetected vulnerabilities. However, some extremely large code changes often involve repetitive or mechanical edits (e.g., pattern-based refactoring or removing deprecated code) rather than modifications to intricate logic, making them less prone to oversights. Interestingly, the CCrevision feature also indicates that ViCs typically undergo fewer revisions during their code reviews than LNCs. This observation supports the idea that some LNCs may initially contain vulnerabilities that are addressed through the code review process, leading to more revisions.
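For context, below is a hedged sketch of how one change-complexity signal discussed here (modified line count, with the >250-line example threshold) could be computed from a git checkout; the helper names and the `--numstat` approach are illustrative assumptions.

```python
import subprocess

def lines_changed(repo, commit):
    """Total added plus deleted lines in a commit, via `git show --numstat`."""
    out = subprocess.run(
        ["git", "-C", repo, "show", "--numstat", "--format=", commit],
        capture_output=True, text=True, check=True).stdout
    total = 0
    for row in out.splitlines():
        if not row.strip():
            continue
        added, deleted, _path = row.split("\t", 2)
        if added != "-":                      # "-" marks binary files
            total += int(added) + int(deleted)
    return total

def is_large_change(repo, commit, threshold=250):
    """Flag changes above the size threshold mentioned above (illustrative)."""
    return lines_changed(repo, commit) > threshold
```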

The HH (Human History) feature set confirms a trend. In general, authors and reviewers previously involved in ViCs are more likely to be associated with the introduction of new ViCs. This pattern is evident in Figure 5 (HHauthor and HHreviewer sub-graphs), where ViCs exhibit high-density clusters slightly to the right of the LNC value clusters. The sparse distribution on the left side of the upper row (representing individuals with only one ViC at the time of analysis) is likely to converge towards the right-side cluster for ViCs over time. This finding highlights the importance of identifying ViCs and providing early, targeted feedback to the involved software engineers. Such feedback can improve their understanding of vulnerabilities, aiding prevention efforts in the near future.

The VH (Vulnerability History) feature set indicates that ViCs and LNCs generally modify a similar set of files. However, some ViCs introduce changes to previously untouched files, and such modifications consistently result in vulnerabilities in the analyzed dataset. This can be explained by two scenarios: a newly created file is modified for the first time, introducing a ViC, or a file undergoes multiple local edits that are later combined (e.g., using the git squash mechanism) into a single commit (ViC) visible in the main repository. The practice of infrequently upstreaming large, merged changes potentially increases the risk of vulnerabilities.

This paper prioritizes the characterization of impactful feature sets. Other individual feature sets are omitted due to redundancy with the characteristics described above or the lack of clear patterns in their visualizations in Figure 5. At the same time, this underscores the importance of multivariate analysis, as demonstrated in Figure 6. There, specific combinations of two features (i.e., 5 pairs in total) yield relatively effective classifiers with clear clustering patterns (or hyperplanes) in the two-dimensional space. Analyzing only single features or pairs would provide an incomplete understanding of the true potential of the entire feature sets, given the numerous informative combinations possible. Thus, a comprehensive evaluation study is crucial.
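To illustrate what a pairwise (two-feature) evaluation could look like in practice, here is a hedged scikit-learn sketch that scores every feature pair with a cross-validated decision tree; the feature names, the random placeholder data, and the F1 scoring choice are assumptions for illustration only.

```python
from itertools import combinations

import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier

def rank_feature_pairs(X, y, feature_names, cv=10):
    """Score every 2-feature combination and return pairs sorted by mean F1."""
    scores = []
    for i, j in combinations(range(X.shape[1]), 2):
        clf = DecisionTreeClassifier(random_state=0)
        f1 = cross_val_score(clf, X[:, [i, j]], y, cv=cv, scoring="f1").mean()
        scores.append(((feature_names[i], feature_names[j]), f1))
    return sorted(scores, key=lambda item: item[1], reverse=True)

# Illustrative placeholder data: 6 features, ~10% ViCs.
rng = np.random.default_rng(2)
X = rng.random((600, 6))
y = (rng.random(600) < 0.1).astype(int)
names = ["HHauthor", "HHreviewer", "VHfile", "CCsize", "CCrevision", "HPauthor"]
for pair, f1 in rank_feature_pairs(X, y, names)[:5]:
    print(pair, round(f1, 3))
```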

:::info Author:

  1. Keun Soo Yim

:::

:::info This paper is available on arxiv under the CC BY 4.0 DEED (Attribution 4.0 International) license.

:::
