Digital Democracy Dies in the Dark
Since the first time a computer connected to another computer in 1962 and formed the nascent link that grew into what we know as the internet today (Leiner et al. 3), securing it has proven an unrelenting challenge. The objective of the internet was to share information, and though it has done that quite successfully, it has also created new vulnerable spaces accessible to the rest of the world. At no other point in history have companies, governments, individuals, and infrastructure been so easily accessible to the world at large. Even so, with more than three billion internet users and counting, only 42% of the world's population is online (Saleh). As more of the world's population enters the ether, the attackable surface available for exploitation will grow with it. Preparing for that future begins at home, by preventing and addressing evolving weaknesses. Critical cybersecurity issues currently facing public cyberspace and externally facing networks in the U.S. include weak network security architecture, mismatched skills, and a lack of needed policy support and authority. As tireless efforts to connect the entire globe bring more traffic and more users online, these issues will only be exacerbated.
The first issue to discuss is weak network security architecture. Nationally, we are only as strong as our weakest networks, and those networks are built and maintained by fellow humans. This echoes Singer's point that the human factor is often, unavoidably, the weakest link (65). Complete security is a myth in general, and especially so in a liberal democracy. The varying degrees of cybersecurity across the public sector stem from numerous causes: lack of prioritization driven by financial constraints, resource constraints, or knowledge gaps regarding the importance of cybersecurity, to name a few.
A contributing factor to this challenge is that the United States possesses and supports a digital democracy. Though beneficial to residents and to innovation, the freedoms it affords can hinder nationwide security. Authoritarian countries like China have more flexibility and control in providing and enforcing public sector security. That control makes digital authoritarianism possible, and it extends to the enforcement of cybersecurity policies (Wright 2).
One potential solution is accessible government support, such as free or discounted training programs that eligible organizations could use to train their cybersecurity staff. Alternatively, if incentives fail to motivate organizations, penalties and fees for failing to meet state and federal security standards are an option. A similar structure already exists under the Federal Information Security Management Act (FISMA), which federal agencies such as the U.S. Department of Defense must implement. Any entity working with the Department of Defense that fails to meet the established security standards faces steep penalties and fines (RSI Security) and risks termination of its contract or business agreement with the Department. Additionally, because we are speaking of the public sector, any individual required to share private information with a company to conduct a business transaction should be able to learn whether that company is compliant with the latest cybersecurity standards. The National Institute of Standards and Technology (NIST) set the initial framework for FISMA; perhaps a comparable act could be passed for public-facing businesses. We'll call it the Public Information Security Act (PISA).
The public would not tolerate a banking institution that failed to meet banking regulations, nor a hospital unable to meet the requirements of the Health Insurance Portability and Accountability Act (HIPAA) or other safety and security rules (Office for Civil Rights). If compliance status were made publicly available, consumers would have the transparency to choose not to risk their information with a non-compliant company. Public scrutiny and pressure could also become key motivators.
The next issue is the mismatch between attackers and defenders. Imagine nation-state offensive teams targeting a hospital; it would be like soldiers from another country fighting office workers. The average American cybersecurity team, if one exists at all, is not employed as cyber combatants spending their days fighting off nation-states or criminals looking to profit on the black market. Though defending against attacks is a clearly defined and essential role, sometimes it is not enough. Staff are not, and should not be, spending their time conducting attacks on adversarial nations or engaging in any offensive behavior. Their time, energy, and resources are, as they should be, spent supporting, protecting, and building the security posture of the organization that employs them. They should not have to prepare for a cyber battle armed with a paperclip and a piece of gum, yet that is the level of support some individuals find themselves with. Moreover, adversaries need only one vulnerability to exploit, while the defensive team must prepare for and defend against every possible scenario. As Anderson stated, "Even a very moderately resourced attacker can break anything that's at all large and complex" (5).
This issue has only grown over time, and I suspect that is because we cannot change the world. It is time to redefine each company's needs and consider what it would look like to train cybersecurity staff to better handle the situations described above. It is typical to find front-line security members "watching the wire," but is that enough? Alerting and blocking are effective only when the attack is detected and the network possesses the tools and architecture required to block it. If the attack uses unknown methods or leverages a zero-day exploit, a previously undisclosed critical vulnerability, the target is wide open and at extremely high risk of compromise. For these reasons, the government could consider providing staff augmentation or other assistance programs with the authority and resources to determine what skills and roles are necessary and to enable staff and organizations to respond more aggressively, which leads to the final critical issue.
Building on the earlier prescription of government support and aid for the public space, there is also the question of authority and autonomy: what offensive measures can the U.S. government authorize without the President's approval? Bolstering policy support and addressing the problem Leed describes, "insufficient legal or policy authority to permit the use of offensive cyber capabilities at the tactical level" (4), could indirectly benefit the public sector. Flexibility that allows trusted authorities to review and approve offensive operations could open a new front of defense. Adversarial states may be less inclined to attack public sector cyberspace knowing the U.S. government can respond offensively, quickly, and effectively. The discrepancy and lag Leed implies hurt the image of the U.S.'s proficiency and command of local cyberspace. One solution would be for the President to appoint trusted cyber advisors with the authority to approve tactical offensive cyber operations. Distributing authority could prevent the concentration of power in one role or individual and enable quicker responses to malicious activity.
These are just a few critical issues, and each plays into the others. The obvious question, "why can't we just build a different internet?", may point to a long-term solution but offers no short-term relief. It will take cooperation, teamwork, prioritization, and a dose of reality to get into the shape we need. It is time to accept the reality that we, as a nation, are at war in the public sector and have inadvertently left some of our most critical defenders exposed and unsupported.
Works Cited
Anderson, Ross. "Why Information Security Is Hard -- An Economic Perspective." 17th Annual Computer Security Applications Conference (ACSAC '01), IEEE Computer Society, Dec. 2001.
"Federal Information Security Management Act (FISMA) Implementation Project." NIST, 19 Mar. 2018, www.nist.gov/programs-projects/federal-information-security-management-act-fisma-implementation-project.
Leed, Maren. “Offensive Cyber Capabilities at the Operational Level.” Center for Strategic and International Studies, www.csis.org/analysis/offensive-cyber-capabilities-operational-level. Accessed 1 Apr. 2022.
Leiner, Barry, et al. "Brief History of the Internet." Internet Society, 22 Mar. 2022, www.internetsociety.org/resources/doc/2017/brief-history-internet.
Office for Civil Rights (OCR). “HIPAA for Professionals.” HHS.Gov, 16 Aug. 2021, www.hhs.gov/hipaa/for-professionals/index.html.
Saleh, Khalid. “How Much Of The World Population Is Online – Statistics And Trends.” Invesp, 11 Apr. 2018, www.invespcro.com/blog/world-population-online.
RSI Security. "Penalties for Non-Compliance with FISMA (and How to Avoid Them)." RSI Security, 20 Dec. 2018, blog.rsisecurity.com/penalties-for-non-compliance-with-fisma-and-how-to-avoid-them.
Singer, P. W., and Allan Friedman. Cybersecurity and Cyberwar: What Everyone Needs to Know. Oxford University Press, 2013.
Wright, Nicholas. “How Artificial Intelligence Will Reshape the Global Order: The Coming Competition Between Digital Authoritarianism and Liberal Democracy.” Foreign Affairs, 24 Nov. 2021, www.foreignaffairs.com/articles/world/2018-07-10/how-artificial-intelligence-will-reshape-global-order.