The Profit Playbook of Big Tech: A Slow Decay of Social Platforms
In 2022, the global digital advertising market reached a staggering $602 billion. Yet, this growing pie hasn’t translated into a better experience for users. Instead, it has accelerated the degradation of digital platforms, a phenomenon Cory Doctorow aptly termed “enshittification.” Think of platforms like Facebook, Instagram, or X (formerly Twitter): they entered our lives promising connectivity, fairness, and free access. Today, they are unrecognizable. Cluttered interfaces, ubiquitous ads, manipulated algorithms, endless data exploitation — the erosion is systemic and intentional.
Breaking from the Innocence of Early Internet Platforms
This is not merely a case of declining functionality; it’s a strategy. Platforms adopt a three-stage approach: first, they woo users with features that seem too good to be true. Think Facebook’s ad-free, community-focused charm in its early years. Then, they pivot to businesses, turning users into a commoditized audience base for advertisers. By the final stage, both users and businesses are leveraged to maximize profit extraction — often at the cost of trust, authenticity, and usability.
Case in point: Instagram, once a platform for personal expression, now bombards its users with algorithmically determined ads and “recommended content” they never opted into. Similarly, YouTube has weaponized unskippable ads, effectively pushing many users into its premium subscription model. And X? The sale of blue-tick verification, allowing unqualified accounts to simulate credibility, has undermined the platform’s integrity as a tool for authentic public discourse.
Contrast this with the internet’s original ethos. The early web focused on user empowerment and open interfaces. What we see now is a shift into walled gardens — platforms designed to trap users for monetization rather than serve their needs. This isn’t just a case of poor design or unintended consequences; it’s a calculated strategy baked into Big Tech’s business model.
The Regulatory Machinery: Lofty Claims or Genuine Accountability?
The Indian government is not oblivious to these patterns of exploitation. Over the past two years, New Delhi has introduced a suite of legislation and policy interventions aimed at addressing the monopoly power and predatory practices of Big Tech. The Digital Personal Data Protection Act, 2023, for instance, mandates consent-based data collection and imposes penalties for violations, while the Competition (Amendment) Act, 2023 enhances the Competition Commission of India’s (CCI) authority to target anti-competitive behaviour.
However, there is an undeniable gap between these instruments’ intent and execution. Take the proposed Digital Competition Bill (currently in draft form). It promises to crack down on practices like self-preferencing and gatekeeping by platform monopolies. Yet, it is likely to face resistance not only from tech giants but also from bureaucratic inertia. The Indian government’s track record on implementing tech-related guidelines, as seen with the IT Intermediary Rules, 2021, has been uneven at best.
Moreover, the open-ended language of several provisions — such as what constitutes “excessive” data collection or “manipulative algorithms” — leaves much room for procedural ambiguity. Without clarity, enforcement could end up as either toothless or excessively punitive, risking a chilling effect on legitimate digital innovation.
What Official Claims Miss: The Story the Data Tells
The rhetoric of regulation often paints a comforting picture for users. Yet, ground realities expose the cracks. According to a 2024 report by the Internet and Mobile Association of India, 92% of users reported encountering excessive ads on at least one major platform. Meanwhile, a survey conducted by LocalCircles revealed that 67% of Indian users felt their data was being used without adequate consent, even after the passage of the DPDP Act.
It is also crucial to spotlight the digital divide exacerbated by enshittification. Platforms increasingly reserve premium, ad-free experiences for paying users. This means that while urban, affluent users may afford YouTube Premium or ad blockers, economically disadvantaged sections remain at the mercy of manipulative algorithms and aggressive tracking. The rhetoric of “free access” rings hollow when the quality of such access depends on socio-economic privilege.
Notably, this model of degradation is unsustainable even for the platforms themselves. User fatigue is real. A study by Deloitte in late 2023 found that nearly 24% of Indian users planned to leave or reduce their interaction with at least one major digital platform in the next 12 months, citing declining trust and usability as key reasons. Without course correction, digital platforms risk becoming victims of their own exploitative designs.
The Embrace of Alternatives: A Tale of Two Internets
When South Korea found its digital ecosystem dominated by a handful of monopolistic platforms in the late 2010s, it adopted bold measures. The government’s Fairness in Online Platforms Act (2020) prohibits unfair trade practices and mandates algorithmic transparency for all major digital players. But it went one step further — actively funding open-source and interoperable digital solutions, ensuring that small businesses and users could avoid dependency on Big Tech.
In comparison, India’s own Open Network for Digital Commerce (ONDC) is a crucial first step but faces immense scaling challenges. To truly emulate South Korea’s success, New Delhi must align infrastructural investments with regulatory diligence, ensuring alternatives are not just theoretical but practical and accessible.
Discomforting Questions That Remain Unasked
One of the core failures of India’s response to digital enshittification lies in the lack of participation and consultation. Who gets to decide what counts as “exploitative” or where the line between personalization and manipulation lies? Despite the passage of ambitious laws, algorithmic transparency is still more buzzword than reality, with platforms retaining significant opacity around how decisions are made.
Another glaring issue is state-level capacity. Digital literacy across India remains wildly uneven, with states like Kerala far ahead of, say, Bihar. Implementation of consumer protection mechanisms varies similarly. How can a national framework address such disparities without a granular, localized strategy?
Finally, the political economy of regulating global platforms raises tricky questions. India’s willingness to confront Big Tech mirrors global trends but risks retaliatory economic consequences, such as reduced investments or protracted legal battles. Can uniform regulation effectively cater to global platforms without stifling their contributions to India’s burgeoning digital economy?
Practice Questions for UPSC
Prelims Practice Questions
Q1. With reference to India’s efforts to regulate Big Tech, consider the following statements:
- Statement 1: The Digital Personal Data Protection Act, 2023 has been implemented without any gaps.
- Statement 2: Ambiguities in legislation may hinder effective enforcement against Big Tech.
- Statement 3: Regulatory measures are uniformly accepted by all tech companies.
Which of the above statements is/are correct?
Q2. With reference to the business models of major social media platforms, consider the following statements:
- Statement 1: They focus solely on enhancing user experience.
- Statement 2: They exploit user data to monetize their services.
- Statement 3: They consistently maintain transparency with their users.
Which of the above statements is/are correct?
Frequently Asked Questions
What does the term 'enshittification' refer to in the context of social platforms?
'Enshittification' refers to the systemic degradation of social platforms, characterized by cluttered interfaces, ubiquitous ads, and manipulated algorithms. This phenomenon indicates a deliberate strategy by tech companies to prioritize profit at the expense of user experience and trust.
How do social platforms typically evolve from user-friendly to profit-driven models?
Social platforms usually follow a three-stage strategy: they first attract users with appealing features, then shift focus to monetizing their user base by catering to advertisers, and finally leverage both users and advertisers to maximize profits, often compromising usability and integrity in the process.
What is the significance of the Digital Personal Data Protection Act, 2023?
The Digital Personal Data Protection Act, 2023 is significant as it aims to enhance user rights and data protection through consent-based data collection and penalties for non-compliance. However, gaps in implementation and interpretation may challenge its effectiveness in combating the exploitative practices of digital platforms.
What challenges does the Indian government face in regulating Big Tech?
The Indian government faces various challenges in regulating Big Tech, including bureaucratic inertia and resistance from the tech giants themselves. Furthermore, ambiguous language in legislation regarding issues like 'manipulative algorithms' complicates enforcement, risking ineffective or overly punitive measures that could stifle innovation.
How does socioeconomic privilege affect user experiences on digital platforms?
Socioeconomic privilege plays a crucial role in user experiences on digital platforms, as those with financial means can afford premium services that offer ad-free experiences. In contrast, economically disadvantaged users often face exploitative practices and manipulative algorithms, highlighting the inequity in access and quality of digital services.
About LearnPro Editorial Standards
LearnPro editorial content is researched and reviewed by subject matter experts with backgrounds in civil services preparation. Our articles draw from official government sources, NCERT textbooks, standard reference materials, and reputed publications including The Hindu, Indian Express, and PIB.
Content is regularly updated to reflect the latest syllabus changes, exam patterns, and current developments. For corrections or feedback, contact us at admin@learnpro.in.