
HI6035 IS Governance and Risk Case Study Sample
Assessment Task
This assignment challenges you to critically evaluate an IS Governance framework and propose a risk management strategy for a given case study. Your goal is to demonstrate how effective governance and risk management can support the organization's overall strategy and operational objectives.
Objectives:
1. To enhance your ability to analyse governance frameworks within an organizational context.
2. To develop practical risk management strategies using industry-standard methodologies.
3. To critically evaluate the impact of governance and risk management on information systems operations.
Details:
Case Study Analysis: Choose a case study that describes a business scenario involving governance and risk issues, published in a journal or conference paper within the last five years. Carefully read and analyse the case to understand the underlying governance challenges and risks.
Governance Framework Assessment:
1. Framework Selection: Select an appropriate IS Governance framework (e.g., COBIT, ITIL, ISO/IEC 38500).
2. Analysis: Critically assess the selected framework’s ability to address the governance challenges identified in the case study.
3. Implementation Plan: Propose a plan for implementing the framework within the organization, including key roles, responsibilities, and processes.
Risk Management Strategy:
1. Risk Identification: Identify the key risks associated with the IS operations described in the case study.
2. Risk Assessment: Evaluate the likelihood and impact of these risks using a recognized risk assessment methodology.
3. Mitigation Plan: Develop a risk mitigation plan that includes preventive, detective, and corrective controls.
Expected Outcomes: Discuss the expected outcomes of implementing your governance and risk management strategies, including potential benefits, challenges, and impacts on organizational performance.
Key Elements of the Assessment:
1. Introduction: Provide an introduction or background to the governance and risk issues, including an identification of gaps in the current approach.
2. Governance Framework: Analyze the selected governance framework, including its strengths, weaknesses, and applicability to the case study.
3. Risk Management: Present a comprehensive risk management strategy, detailing the identification, assessment, and mitigation of risks.
4. Implementation: Discuss the practical steps for implementing your governance and risk management strategies within the organization.
5. Conclusion: Summarize your findings and suggest areas for further improvement or research.
List of references
This must be provided in the usual scholarly fashion. It helps to convince your reader that your proposal is worth pursuing if you can identify literature in the field and demonstrate that you understand it. It makes a very strong impact if you can identify where there is a research gap in the literature that your proposal hopes to fill. This is your contribution to the scholarly conversation. You should use academic references (peer reviewed articles), rather than web articles.
Solution
1. Introduction
In 2018, the Facebook–Cambridge Analytica scandal marked a major turning point in the conversation surrounding data privacy, governance, and the ethics of data use in the digital world. It was revealed that the political consulting firm Cambridge Analytica had harvested personal data from at least 50 million Facebook accounts without their owners’ permission. Robert Mercer, who was pivotal in creating Cambridge Analytica and reportedly played a key role in the company's ability to target specific voter demographics, was a major donor to campaigns that sought to foment pro-Brexit sentiment as well as to the 2016 U.S. presidential election. The disclosures illuminated grave governance failures at Facebook and exposed systemic weaknesses in how user data is managed and protected (González-Pizarro et al., 2022). One of the scandal’s greatest revelations was the abuse of a seemingly harmless personality quiz app, which leveraged Facebook’s infrastructure to harvest data from hundreds of thousands of users and, without their knowledge, from those users’ friends.
Facebook’s API policy meant that data could be collected on a massive scale, calling into question whether the company had adequate governance frameworks, oversight mechanisms, and ethical standards in place. The fallout included intense scrutiny of Facebook from regulators, lawmakers, and the public, regulatory fines, a congressional hearing into Facebook’s practices, and damage to the company’s reputation. Beyond Facebook, the scandal was an important alarm bell for the tech sector and for policymakers about the pressing need for strong data privacy regulation and corporate accountability in the use of sensitive user data (Jeleskovic and Wan, 2024).
2. Literature Review
2.1 Governance framework analysis
Governance Framework: GDPR
Figure 1: GDPR
Source: (plane, 2025)
The General Data Protection Regulation (GDPR), which the European Union brought into force in 2018, is one of the most significant data protection regulations: it gives users greater data privacy and strictly regulates the handling of personal data. In the context of the Facebook–Cambridge Analytica scandal, the GDPR’s most relevant aspects are its insistence on specific consent mechanisms and its emphasis on accountability and transparency, mitigation measures that could have reduced the risk of the kind of data misuse involved in this case had they been in place beforehand (Machova, 2021).
Strengths
GDPR provides a well-structured framework for data protection built on data minimization, purpose limitation, and explicit consent. It requires organizations to designate Data Protection Officers (DPOs) to ensure compliance and to perform regular impact assessments. These characteristics give the framework considerable enforcement capacity, with high financial penalties for non-compliance providing meaningful incentives to comply (Fernandez, 2022).
Weaknesses
Its breadth, however, means that implementing and enforcing GDPR can be difficult, given the global reach of many large corporations operating across jurisdictions. Relying on individual data subjects to report violations can create a bottleneck, and compliance can be costly and complex for smaller organizations. In addition, the regulation’s focus on explicit consent can conflict with user experience goals and lead to consent fatigue from an overabundance of consent requests (Piranda, Sinaga and Putri, 2022).
Relevance to the Case Study
Had Facebook embedded GDPR principles in its governance, Cambridge Analytica would have had far less opportunity to misuse data. GDPR’s focus on explicit consent would have ensured that users were fully informed about whether, and in what ways, their data was being collected. Regular audits and accountability measures, as required under GDPR, could have detected and halted the exploitative practices made possible by Facebook’s API policies. Although GDPR alone would not necessarily stop all instances of data misuse, applying it to this case shows just how valuable, and indeed necessary, regulatory oversight can be for protecting user privacy and maintaining trust in digital platforms (Kraus et al., 2022).
2.2 Risk Management Strategy
Identification of Risks
Effective risk management begins with identifying data privacy and security risks. Uncontrolled data access, third-party exploitation, weak API controls, and low user awareness are examples of the risk scenarios Facebook faced. Regular audits and vulnerability assessments should be carried out to identify systemic weaknesses and potential points of exploitation (Piranda, Sinaga and Putri, 2022).
Assessment of Risks
The next step is to assess each identified risk by its probability of occurrence and its potential impact. Third-party developers represented a major risk for Facebook, as the Cambridge Analytica scandal demonstrated. Risk matrices and scoring systems can then be used to prioritize the highest-risk areas, such as broad third-party access permissions or weak oversight of data-sharing practices (Asimovic et al., 2021).
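To illustrate how such a risk matrix can be applied, the short Python sketch below scores a handful of hypothetical risks on 1–5 likelihood and impact scales and ranks them; the risk descriptions, ratings, and banding thresholds are illustrative assumptions, not figures taken from the case study.

```python
# Illustrative risk-matrix scoring: score = likelihood x impact on 1-5 scales.
# The risks and ratings below are hypothetical, not data from the case study.
risks = [
    # (risk description, likelihood 1-5, impact 1-5)
    ("Third-party app harvesting friend data via the API", 4, 5),
    ("Users unaware of their data-sharing settings", 4, 3),
    ("Weak oversight of data-sharing agreements", 3, 4),
    ("Delayed detection of abnormal data extraction", 3, 5),
]

def score(likelihood: int, impact: int) -> int:
    """Simple multiplicative score used by many risk matrices."""
    return likelihood * impact

def band(s: int) -> str:
    """Map a numeric score to an assumed qualitative priority band."""
    if s >= 15:
        return "High"
    if s >= 8:
        return "Medium"
    return "Low"

# Rank risks so that mitigation effort targets the highest scores first.
for name, likelihood, impact in sorted(risks, key=lambda r: score(r[1], r[2]), reverse=True):
    s = score(likelihood, impact)
    print(f"{band(s):6} (score {s:2})  {name}")
```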
Mitigation of Risks
Facebook needs to devise strong risk mitigation strategies to address the identified risks. Tightening the API policies that govern data access, rigorously enforcing third-party agreements, and reinforcing user consent mechanisms are important steps. Audit logs and automated monitoring can also help detect breaches early, so that access can be disabled before damage is done. Regularly training employees on data ethics and privacy standards is crucial to creating a culture of compliance and awareness (Susannah et al., 2023).
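As a sketch of the kind of detective control described above, the snippet below scans an audit log and flags any third-party app whose data requests within a time window exceed a threshold; the log format, window, and threshold are assumptions made purely for illustration.

```python
# Hypothetical detective control: flag third-party apps whose recent data
# requests exceed an assumed per-window threshold. Log format is illustrative.
from collections import Counter
from datetime import datetime, timedelta

# Each audit-log entry: (timestamp, app_id, records_accessed)
audit_log = [
    (datetime(2018, 3, 1, 10, 0), "quiz_app", 120_000),
    (datetime(2018, 3, 1, 10, 5), "weather_app", 300),
    (datetime(2018, 3, 1, 10, 20), "quiz_app", 450_000),
]

WINDOW = timedelta(hours=1)
THRESHOLD = 100_000  # assumed cap on records any single app may touch per window

def flag_suspicious(log, now):
    """Return app_ids whose access volume in the last window exceeds the cap."""
    totals = Counter()
    for timestamp, app_id, records in log:
        if now - timestamp <= WINDOW:
            totals[app_id] += records
    return [app for app, total in totals.items() if total > THRESHOLD]

print(flag_suspicious(audit_log, datetime(2018, 3, 1, 11, 0)))
# ['quiz_app'] -- access could then be suspended pending review
```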
Monitoring and Adjusting Continuously
Risk management is an ongoing process, not a single event. Organizations must develop data governance frameworks that address the relevant regional jurisdictions, for example the divide between EU and US policy, monitoring compliance risk, updating policies to mitigate emerging threats, and ensuring alignment with data protection requirements globally. By taking a proactive and dynamic approach to risk management, organizations can strengthen their resilience against data privacy violations and preserve user trust (Akgül and Uymaz, 2022).
2.3 Implementation
Source: (coref, 2023)
Setting Up Governance Policies
The first important step is to establish and enforce strong governance policies, based on the principles of GDPR and related regulations. This involves stipulating clear roles and responsibilities for protecting data, forming a data governance committee, and drafting guidelines for data collection, usage, and sharing (Susannah et al., 2023).
Enhancing Consent Mechanisms
To abide by such laws, organizations must create effective, user-friendly consent mechanisms that are easy for individuals to comprehend. That means informing users of the purposes for which their data will be used and obtaining their explicit, informed consent to collect and process that data (Piranda, Sinaga and Putri, 2022).
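To make this concrete, the sketch below models a minimal per-purpose consent record of the kind such mechanisms rely on; the classes, field names, and purposes are illustrative assumptions rather than any platform's actual data model.

```python
# Minimal, illustrative per-purpose consent store. Field names and purposes are
# assumptions for the sketch, not a real platform's data model.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ConsentRecord:
    user_id: str
    purpose: str            # e.g. "personality_quiz", "ad_targeting"
    granted: bool
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

class ConsentStore:
    """Keeps the latest explicit decision per (user, purpose) pair."""
    def __init__(self):
        self._records = {}

    def record(self, user_id: str, purpose: str, granted: bool) -> None:
        self._records[(user_id, purpose)] = ConsentRecord(user_id, purpose, granted)

    def is_allowed(self, user_id: str, purpose: str) -> bool:
        # No record means no consent: processing defaults to denied.
        rec = self._records.get((user_id, purpose))
        return bool(rec and rec.granted)

store = ConsentStore()
store.record("user_1", "personality_quiz", granted=True)
print(store.is_allowed("user_1", "personality_quiz"))  # True
print(store.is_allowed("user_1", "ad_targeting"))      # False: never consented
```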
Tightening API and Third-Party Controls
API policies need to be reviewed and strengthened to minimize third-party developers’ access to user data. Key measures include fine-grained access controls, defined third-party audit procedures, and written contracts detailing usage restrictions and the consequences of abuse (Susannah et al., 2023).
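A minimal sketch of what such a deny-by-default, scope-based access check for third-party apps might look like is shown below; the scope names and the app registry are hypothetical.

```python
# Hypothetical deny-by-default, scope-based access control for third-party apps.
# Scope names and the app registry are assumptions for illustration.

# Each registered app is granted an explicit, minimal set of scopes.
APP_SCOPES = {
    "quiz_app": {"read:own_profile"},
    "analytics_partner": {"read:own_profile", "read:aggregated_stats"},
}

def authorize(app_id: str, requested_scope: str) -> bool:
    """Allow only scopes explicitly granted to the app; everything else is denied."""
    return requested_scope in APP_SCOPES.get(app_id, set())

print(authorize("quiz_app", "read:own_profile"))        # True
print(authorize("quiz_app", "read:friends_profiles"))   # False: friend data blocked
print(authorize("unknown_app", "read:own_profile"))     # False: unregistered app
```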
Leveraging Technology and Training
Investing in privacy-protecting technologies, such as encryption, intrusion detection systems, and AI-powered monitoring tools, is imperative. Periodic training sessions should also be held for employees, covering data privacy standards and ethical practices around data collection and usage within the organization (Asimovic et al., 2021).
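As one small illustration of the encryption investment mentioned above, the sketch below uses the Fernet recipe from the widely used `cryptography` package to encrypt a single personal-data field at rest; the field and the key handling are deliberately simplified assumptions (a production system would load keys from a managed secrets store or KMS).

```python
# Illustrative field-level encryption at rest with the `cryptography` package.
# Key handling is simplified; real deployments would use a managed key store.
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice, loaded from a secrets manager
cipher = Fernet(key)

email = "user@example.com"
encrypted = cipher.encrypt(email.encode("utf-8"))        # value stored at rest
decrypted = cipher.decrypt(encrypted).decode("utf-8")    # recoverable only with the key

assert decrypted == email
print(encrypted)  # opaque token, useless to a third party without the key
```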
Conducting Regular Audits
Periodic audits of data management practices can help in ensuring that any deficiencies in compliance are identified and corrected. Such audits may include the status of data security, vendor contracts, and effectiveness of consent mechanisms (Machova, 2021).
Building a Culture of Accountability
Encourage transparency, empower employees to speak up, and hold leadership accountable for privacy practices. Clearly articulated escalation protocols strengthen accountability further, as does making data privacy a board-level priority (Kraus et al., 2022).
Regulatory Oversight and Policy Adjustment
Governance and risk management tactics must be continuously assessed and adjusted to remain effective in a rapidly evolving risk landscape and regulatory environment. This requires constant tracking of technological developments, periodic risk assessments, and continuous updating of policies to keep pace with the fast-changing digital landscape (Fernandez, 2022).
3. Methodology
3.1 Approach to analyzing the governance framework and risk management strategy
Approach to Governance Framework Analysis
The Facebook–Cambridge Analytica scandal marks one of the most flagrant failures of governance to ever rock the political world and one that raised serious questions about data privacy and protection. The failures of governance can be viewed through two lenses: one of compliance with data protection regulations (e.g. GDPR) and one of risk management to improve data governance and prevent misuse of user data (Asimovic et al., 2021).
GDPR Compliance
The European Union’s General Data Protection Regulation (GDPR) came into force in 2018. Viewed in the context of Facebook and Cambridge Analytica, the company did not meet even the baseline level of GDPR compliance. Key aspects include:
• Failure to Properly Inform Users: Facebook failed to properly notify users regarding how their information was being collected and used by third-party applications. GDPR requires that the consent of users be explicit and unequivocal before their information can be shared.
• Data Access and Rectification Rights: Under GDPR, organizations must allow individuals to access, correct, and delete their personal information. Facebook’s failure to safeguard sensitive user data against third-party access, and its lack of sufficient oversight, violated these principles.
• Data Security and Breach Notification: Under GDPR, any breach of personal data must be notified to the supervisory authority within 72 hours. The case reveals a delay in both transparency and accountability, because Facebook did not openly disclose the breach promptly (Fernandez, 2022). A small sketch of the 72-hour deadline calculation follows this list.
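As a small, hedged illustration of that 72-hour window, the snippet below computes the notification deadline from an assumed breach-awareness timestamp; the dates are placeholders, not facts from the case.

```python
# Illustrative GDPR Article 33 deadline check: notification to the supervisory
# authority is due within 72 hours of becoming aware of a breach.
# Timestamps below are assumed values for the sketch.
from datetime import datetime, timedelta, timezone

became_aware = datetime(2018, 3, 17, 9, 0, tzinfo=timezone.utc)   # assumed
deadline = became_aware + timedelta(hours=72)

now = datetime(2018, 3, 19, 15, 0, tzinfo=timezone.utc)           # assumed
remaining = deadline - now

if remaining.total_seconds() > 0:
    print(f"Notify the supervisory authority within {remaining} (by {deadline}).")
else:
    print(f"Deadline missed by {-remaining}; the notification must explain the delay.")
```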
Approach to Analyzing Risk Management Strategy
• Data Access Control: Ensuring that access to personal data is strictly regulated through rigorous data-sharing policies and continuous audits of data-sharing agreements that restrict third-party applications and developers.
• Transparency and User Control: Designing intuitive consent mechanisms, keeping users informed throughout the data usage lifecycle, and offering one-click access revocation at any point.
• Continuous Training and Awareness: Educating employees and contractors on data privacy principles and the potential consequences of violations, to minimize the risk of accidental breaches or misuse.
• Proactive Monitoring and Incident Response: Maintaining a robust system for ongoing monitoring of data use and prompt response to data breach incidents, to ensure compliance with GDPR and other privacy laws (González-Pizarro et al., 2022).
4. Analysis and findings
4.1 Analysis
Governance Framework Analysis: GDPR and the Cambridge Analytica Scandal
The Facebook–Cambridge Analytica scandal exposed major governance failures in Facebook’s management of user data, which enabled privacy violations on a massive scale. The scandal broadly illustrates how core principles of the General Data Protection Regulation (GDPR), such as explicit consent and transparency, could have prevented the misuse of user data. Facebook, however, failed to meet the GDPR’s robust requirements for data consent and accountability, making it possible for third parties such as Cambridge Analytica to access user data without obtaining explicit consent. Had Facebook complied with GDPR’s consent provisions and routinely reviewed the data it shared with third parties, the magnitude of the breach might have been reduced. The regulation’s enforcement powers, including substantial financial penalties for non-compliance, would have compelled far better governance (Jeleskovic and Wan, 2024).
Risk Management Strategy
The Cambridge Analytica case also revealed flaws in how social media companies align their businesses with risk management strategies. Weak API controls and a lack of oversight of third-party applications were central causes of the Facebook scandal. A sound risk management approach would include identification and assessment of risks to data privacy, prioritization based on potential impact, and remediation through actions such as strengthening API policies and tightening data handling agreements with third parties. Ongoing monitoring and periodic audits are likewise essential to respond to emerging threats and maintain compliance with data protection legislation (González-Pizarro et al., 2022).
4.2 Findings
Data Breach: The Governance Failures and Data Privacy Risks
The Facebook–Cambridge Analytica scandal highlighted serious governance failures in Facebook’s data privacy practices. The platform’s inability to effectively regulate which third parties had access to users’ data, especially via its API, opened it up to mass abuse of personal information. The breach also exposed the lack of transparency about how third-party apps collect, use, and share data with outside developers. Moreover, Facebook’s terms and conditions did not adequately mitigate risks from outside developers, allowing them to extract, in effect scrape, data from Facebook that they had no right to use. Weak oversight and the absence of routine audits meant that exploitative practices went unnoticed and unaddressed for too long (Machova, 2021).
Results
If GDPR had been fully embedded in Facebook’s governance framework, many if not most of the issues that precipitated the scandal could have been avoided. GDPR would have placed strict restrictions on third parties’ access to and use of user data and insisted on well-defined, clear explanations of how that data would be used. The absence of even minimal risk management practices, such as periodic attempts to identify vulnerabilities and persistent oversight of third-party activities, exacerbated the situation. Effective management of this risk would have entailed restricting access to the data and implementing stronger user consent mechanisms to avert data misuse on such a colossal scale (Kraus et al., 2022).
5. Conclusion
The Facebook–Cambridge Analytica affair is a textbook example of why data privacy and risk management either need to be integrated at the technical level or governed independently at a level above it. As the scandal made clear, weak governance frameworks and inadequate oversight mechanisms can lead to misuse of user data on a large scale, undermining public trust and raising difficult ethical and legal questions. Facebook’s failure to adequately control access to user information by outside entities, combined with a lack of transparency about its data-sharing practices, created the conditions for an unprecedented breach of personal information. The scandal shows that every country needs comprehensive data protection laws, such as the European Union's General Data Protection Regulation (GDPR), to prevent such violations.
If Facebook had applied GDPR’s principles, especially its requirements for explicit consent and accountability, it could have massively reduced the risk of misuse. The regulation’s emphasis on measures such as increased transparency, periodic audits, and the appointment of data protection officers would arguably have given the company better visibility and stronger compliance mechanisms, allowing the data misuse to be discovered sooner. Additionally, a preventive phase of risk management could have helped, in the form of proactive vulnerability assessments, tighter access controls on the APIs the platform exposes, and third-party contracts that explicitly address the threats an external party like Cambridge Analytica poses through the platform’s infrastructure.
Organizations are now being forced to rethink how they truly protect user data. Such measures are critical for companies seeking to regain trust and ensure that personal data is not misused. Protecting data privacy and handling personal information carefully is especially important in emergency situations. As the digital environment becomes increasingly cross-platform, operators should embrace user protections, comply with global regulations, and remain vigilant against the ever-evolving threats to data privacy.
References
Akgül, Y. and Uymaz, A.O. (2022). Facebook/Meta usage in higher education: A deep learning-based dual-stage SEM-ANN analysis. Education and Information Technologies. doi:https://doi.org/10.1007/s10639-022-11012-9.
Asimovic, N., Nagler, J., Bonneau, R. and Tucker, J.A. (2021). Testing the effects of Facebook usage in an ethnically polarized setting. Proceedings of the National Academy of Sciences, 118(25), p.e2022819118. doi:https://doi.org/10.1073/pnas.2022819118.
coref (2023). [online] Corefactors.in. Available at: https://www.corefactors.in/blog/content/images/2023/05/FB-Marketing-strategies-1.png [Accessed 22 Jan. 2025].
Fernandez, P. (2022). Facebook, Meta, the metaverse and libraries. Library Hi Tech News, 39(4), pp.1–5. doi:https://doi.org/10.1108/lhtn-03-2022-0037.
González-Pizarro, F., Figueroa, A., López, C. and Aragon, C. (2022). Regional Differences in Information Privacy Concerns After the Facebook-Cambridge Analytica Data Scandal. Computer Supported Cooperative Work (CSCW), 31. doi:https://doi.org/10.1007/s10606-021-09422-3.
Jeleskovic, V. and Wan, Y. (2024). The impact of Facebook-Cambridge Analytica data scandal on the USA tech stock market: An event study based on clustering method. [online] arXiv.org. doi:https://doi.org/10.48550/arXiv.2402.14206.
Kraus, S., Kanbach, D.K., Krysta, P.M., Steinhoff, M.M. and Tomini, N. (2022). Facebook and the creation of the metaverse: radical business model innovation or incremental transformation? International Journal of Entrepreneurial Behavior & Research, [online] 28(9), pp.52–77. doi:https://doi.org/10.1108/ijebr-12-2021-0984.
Machova, T. (2021). The discourse of surveillance and privacy: biopower and panopticon in the Facebook-Cambridge Analytica scandal. [online] DIVA. Available at: https://www.diva-portal.org/smash/record.jsf?pid=diva2:1578905.
Piranda, D.R., Sinaga, D.Z. and Putri, E.E. (2022). ONLINE MARKETING STRATEGY IN FACEBOOK MARKETPLACE AS A DIGITAL MARKETING TOOL. JOURNAL OF HUMANITIES, SOCIAL SCIENCES AND BUSINESS (JHSSB), 1(3), pp.1–8.
plane (2025). [online] Planetcompliance.com. Available at: https://www.planetcompliance.com/wp-content/uploads/2024/04/Figure-3-1-GDPR-1024x988.png [Accessed 22 Jan. 2025].
Susannah, Johns, M., Murauskaite, E.E., Golonka, E.M., Pandža, N.B., C. Anton Rytting, Buntain, C. and Ellis, D. (2023). Emotional content and sharing on Facebook: A theory cage match. Science Advances, 9(39). doi:https://doi.org/10.1126/sciadv.ade9231.