Background
A Review of a 2017 ACM Conference on Computer and Communications Security Paper: Let’s Go in for a Closer Look: Observing Passwords in Their Natural Habitat.
With text passwords now omnipresent in a world characterized by massive data breaches and information-system compromises, particularly when data falls into the hands of malicious actors, the security research community has been on a never-ending journey: conducting studies, creating awareness, and trying to persuade the general public to create passwords that attackers will find hard to crack but that remain easy for the legitimate users to remember (Anne & Martina, 1999).
Reports from password usability and scalability studies have over the years indicated that no user wants to fall prey to attackers; in fact, users have in good faith (Cynthia, et al., 2006), and in the security interest of their own information, tried, largely without success, to follow the existing password creation and management guidelines (Cormac, 2009).
This paper, presented at the ACM Conference on Computer and Communications Security held in Dallas, TX, USA in October 2017 by Sarah Pearman et al., occupies a position of its own kind in today’s world of security. The team takes its audience through the many challenges the public faces when trying to create strong passwords in line with password management guidelines (US-CERT, 2016). As a way of protecting users’ accounts from looming attacks, the guidelines call for passwords that are at least eight characters long and contain no guessable words or character sequences (Sarah, et al., 2017). Additionally, users are advised to create a distinct, randomly chosen password for each of their accounts. The main challenge, however, is posed by the sheer number of accounts that present-day internet users hold, further compounded by password complexity requirements, which together make efficient password management an unrealistic demand on human memory: one simply cannot create and memorize distinct, effective passwords for all of one’s accounts.
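To make the guideline concrete, the sketch below (in Python, with a made-up word list and hypothetical rule thresholds, not a checker that any guideline actually prescribes) shows the kind of test such advice implies: a minimum length of eight characters, a mix of character classes, and no obviously guessable words.

    import re

    # Hypothetical illustration of US-CERT-style advice; the word list and the
    # "three of four character classes" rule are stand-ins, not official rules.
    COMMON_WORDS = {"password", "qwerty", "letmein", "welcome", "dragon"}

    def meets_guideline(password: str) -> bool:
        if len(password) < 8:                      # minimum length of eight characters
            return False
        lowered = password.lower()
        if any(word in lowered for word in COMMON_WORDS):
            return False                           # contains an easily guessable word
        classes = [
            re.search(r"[a-z]", password),         # lowercase letters
            re.search(r"[A-Z]", password),         # uppercase letters
            re.search(r"\d", password),            # digits
            re.search(r"[^A-Za-z0-9]", password),  # special characters
        ]
        return sum(1 for c in classes if c) >= 3   # require a mix of character classes

    print(meets_guideline("password123!"))  # False: built around a common word
    print(meets_guideline("T!ger-Lily42"))  # True: long and mixed, nothing guessable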
As a research effort, the team went the extra mile to study the password management practices that should inform the design of secure systems and interfaces, taking a much closer look at the extent to which internet users rely on passwords in their daily online activities and the manner in which they reuse them (via the Security Behavior Observatory), unlike traditional studies that have been limited to and dependent on indirect measurements and self-reported surveys. To strengthen the accuracy and validity of the project and its data, the team ran a comprehensive longitudinal study in which they carefully examined passwords and password-related behaviors, obtained approval from an ethics review board, and collected computer-related information such as the privacy and security settings of web browser extensions. Furthermore, in order to understand how users create guessing-attack-resistant passwords, the team deployed a neural network on participants’ machines to compute and record the strength of those passwords locally (William, et al., 2016).
Password Management Practices
From the study, it emerged that a large share of the participants found it terribly hard to create and recall different passwords for their different accounts and domains (Rick, et al., 2016). As such, they resorted to partial and exact password reuse (Dinei & Cormac, 2007). While previous studies had focused on partial reuse only, this paper brought exact password reuse to the attention of the computer security community, putting it at 12% of the total participants. The authors further argued that many internet users maintain clusters of reused passwords, whether partial or exact, a condition that very probably exposes them to security threats across the board. Strikingly, some participants confessed to having multiple accounts that shared the same password.
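As a rough illustration of the distinction the study draws, the Python sketch below classifies a pair of passwords as an exact reuse, a partial reuse, or distinct. The "shared substring of at least four characters" notion of partial reuse is a simplification of my own, not the definition the authors used.

    # Simplified stand-in for the exact/partial reuse distinction; the
    # four-character overlap threshold is arbitrary, not the paper's definition.
    def classify_reuse(pw_a: str, pw_b: str, min_overlap: int = 4) -> str:
        if pw_a == pw_b:
            return "exact reuse"
        for length in range(len(pw_a), min_overlap - 1, -1):
            for start in range(len(pw_a) - length + 1):
                if pw_a[start:start + length] in pw_b:
                    return "partial reuse"
        return "distinct"

    print(classify_reuse("Tr0ub4dor&3", "Tr0ub4dor&3"))   # exact reuse
    print(classify_reuse("Tr0ub4dor&3", "Tr0ub4dor!9"))   # partial reuse
    print(classify_reuse("Tr0ub4dor&3", "correcthorse"))  # distinct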
While the team noted that stronger passwords had a low probability of being reused, it was evident that such passwords were created only for the accounts that users considered most important to them; these passwords were not only satisfactorily long but also contained a mixture of special characters, letters and digits. On the flip side, short passwords containing special characters and digits were the most likely to be reused, thanks to users’ perception that such passwords are more secure; little do they know that their security is easily compromised in such situations, especially when the same password guards many accounts or is stored in easily guessable locations. In addition, the report pointed to a high percentage of password reuse on government websites and related systems, under the pretext that these systems have stricter password policies and are therefore more secure, overlooking that such sites are among the most vulnerable and the greater target.
On password autofill functionality, the paper reports that 19% of the participants had installed password managers, but the study could not ascertain whether autofill events came from the browser’s native functionality or from third-party password managers. Additionally, the study did not account for the population accessing password managers from their mobile phones.
The team further argued that if we are to rely on password managers, then changes must be made to these applications before they can deliver genuinely strong passwords. This would, in turn, create a scenario in which users consistently employ strong passwords without any need to reuse them. This observation points to the need for further analysis to determine whether it is the native web browsers or the password managers that perform the password autofill. Secondly, an understanding of whether these tools are effectively serving the masses, to the extent of relieving them of the burden of memorizing the much-emphasized strong passwords, is sorely needed, and it would only be fair for future studies to focus on these questions.
Significance of the Research
In my own assessment, I find the ideas presented in this research paper important and equally educating. Unauthorized access has become a major global concern, with victims experiencing untold levels of data breach, compromise and privacy violation, including theft of money straight from their bank accounts and online payment systems. These dire effects demand our attention: we must raise awareness about password usage and devise mechanisms that help us manage our passwords, and our access to critical information, more efficiently and effectively. The team’s report aligns with the established requirements for password security: passwords should be long and complex, hard to guess and crack, a combination of special characters and digits, not reused, and never written down on pieces of paper or in any other recoverable form (Anna, n.d.).
The audience of this research is all of us: anybody who uses high-tech devices such as laptops, smartphones, tablets, music players and desktops, and consumers of systems such as Customer Relationship Management (CRM) and Enterprise Resource Planning (ERP) systems, among the many other systems that require us to authenticate ourselves with personal details and in which we store our personal information.
The acceptance of this paper at such a high-level conference was in order and highly commendable. The world has to know where the rain started beating us and quicken its pace in the arms race of providing and guaranteeing password security. Equally, all system users are given a wake-up call and brought to the realization that many practices which, in our own perception, do not predispose us to security threats in fact do. It is also a call to security researchers to improve the capabilities of password management systems so that in the future they relieve users of the burden of creating and memorizing passwords.
The second paper under review turns from passwords to platform security. The IT security domain is being positively changed and reshaped by ‘trusted computing’ practices as coined and developed by the Trusted Computing Group (TCG). According to the group, an entity is technically trusted if, for any intended purpose, it behaves as expected (Trusted Computing Group, n.d.). The technology involves embedding a permanent microcontroller security chip, the Trusted Platform Module (TPM), on the motherboard of most PCs shipped today; the chip is used during the boot process to establish a level of trust and to gather measurements as application environments are launched. Once collected, the measurements are recorded in the TPM’s Platform Configuration Registers (PCRs) in a way that is computationally infeasible to reverse, making it hard for forgeries to be substituted for them (Sansar Choinyambuu, 2011).
Trusted Computing and the Root of Trust for Measurement
The idea of trusted computing is anchored on two concepts: the static and the dynamic root of trust for measurement. The Static Root of Trust for Measurement (SRTM) is an immutable piece of code fixed in the system BIOS (Basic Input Output System). It is loaded at the very start of the boot process, and each subsequent piece of code is checked and measured by its predecessor before it is executed. This ensures that code is measured and verified before it runs, revealing any compromise present in it and giving system users a guarantee of load-time protection (Trusted Boot (tboot), 2010). However, the same mechanism offers no protection after launch time, thus exposing computer users to security risks through the BIOS.
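The chain of measurement described above rests on the TPM’s "extend" operation: each component’s hash is folded into a Platform Configuration Register before the component runs, so the final register value depends on every step in order. A minimal Python sketch of that operation (using SHA-1, as TPM 1.2 does, with made-up component names) follows.

    import hashlib

    def extend(pcr: bytes, measurement: bytes) -> bytes:
        # PCR_new = SHA1(PCR_old || SHA1(measurement)): the register can only be
        # extended, never set directly, so earlier measurements cannot be erased.
        return hashlib.sha1(pcr + hashlib.sha1(measurement).digest()).digest()

    pcr = bytes(20)  # PCRs start as 20 zero bytes at reset
    for component in (b"BIOS boot block", b"BIOS main image", b"option ROMs"):
        pcr = extend(pcr, component)
    print(pcr.hex())  # any change or reordering above yields a different value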
The TPM 1.1 specifications, adopted in 2003 soon after the Trusted Computing Group was founded, have gradually evolved, like any software, to meet the increasing demands for security and to remain technologically coherent (Trusted Computing Group, 2012). This development has seen the group roll out specification 1.2 (2004) and the latest version, 2.0. Typically, TPMs are designed to offer two main services: they act as a Root of Trust for Storage through the Storage Root Key, which encrypts other keys, creating a hierarchy of keys and consequently allowing user data to be encrypted with highly protected keys; and they provide a Root of Trust for Reporting (RTR) as a way of guaranteeing that tamper-proof measurement storage and reporting functions are not compromised.
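To illustrate the storage hierarchy just described, the sketch below uses Fernet keys from the third-party cryptography package as stand-ins for TPM-protected keys; in a real TPM the Storage Root Key never leaves the chip, and the names here are purely illustrative.

    from cryptography.fernet import Fernet

    srk = Fernet(Fernet.generate_key())     # stand-in for the Storage Root Key
    child_key = Fernet.generate_key()       # a data-protection key lower in the hierarchy
    wrapped_child = srk.encrypt(child_key)  # the child key is stored only in wrapped form

    # To use the data key, it must first be unwrapped under the root key.
    data_key = Fernet(srk.decrypt(wrapped_child))
    ciphertext = data_key.encrypt(b"user secrets")
    print(data_key.decrypt(ciphertext))     # b'user secrets'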
In 2013, a group of four security researchers presented at Black Hat a paper titled ‘Problems with the Static Root of Trust for Measurement’, in which they focused on evaluating the current and historic protection of the BIOS against reflashing (the general protection underpinning the SRTM), the implementation of the SRTM on an existing Dell Latitude E6400, and how PCR values are set together with the deviations that occur from the TPM PC Client specification. They also described the implementation of a parasite known to forge BIOS status, called the tick, and of the flea, a reflash-hopping BIOS parasite, thereby showing why enforced signed updates may not be adequate to protect current systems.
The originality of this paper lies in the group’s exclusive focus on the Static Root of Trust for Measurement, a trust root that, though it depends on the TPM, does not in reality reside in it but in the BIOS, and is not used for on-demand run-time measurements but rather helps to achieve a trusted boot. The tragedy, however, is that the contents of the SRTM can be altered, jeopardizing the whole system and breaking and corrupting the entire chain of protection.
Evaluating the SRTM Implementation
Taking us through this complicated and very important journey of understanding the BIOS, while trying to avoid the kind of ambiguity found in prior writing about the flexibility of the TPM PC Client specification, the team investigates and carefully documents its findings on BIOS security mechanisms and offers a way forward, helping us understand which components of the BIOS and of the wider system can be adversely affected when components and environments that require change detection go unmeasured, leading to compromised security.
NIST (National Institute of Standards and Technology) Special Publication 800-155 of 2011 provides a more detailed description of what the TPM PC Client specification measurements should constitute so as to provide a reliable and sufficient SRTM (Regenscheid & Scarfone, 2011). The team, however, documents a particular SRTM implementation that does not adhere to these guidelines, going so far as to breach some of the very clear recommendations of the TPM PC Client specification. This prompted their study, an attempt to blow the whistle on the whole subject, explaining to the world, to PC designers and to consumers why they should be cautious about existing SRTM implementations and stressing the need to follow the requirements detailed in the 2011 NIST publication.
For a successful analysis of the SRTM, a BIOS firmware image must first be obtained from the system in order to establish where, and understand how, the SRTM is installed. This can be achieved in three ways. The first involves desoldering the flash chip from the mainboard and then dumping its image into a binary file using an EEPROM (electrically erasable programmable read-only memory) reader. The second is to use a kernel driver customized to read the firmware image from the flash chip and write it directly to a binary file, and the last is to extract and decode the information from the vendor-provided BIOS update files. Any of these methods is reliable enough to obtain the binary contents of the SRTM, which can then be statically analyzed using specialized software such as IDA Pro (John, et al., 2017).
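Once a dump has been obtained by any of these routes, the kind of measurement it would contribute can be reproduced offline. The sketch below (the file name is hypothetical) simply hashes a dumped image with SHA-1 so it can be compared against a known-good image or a recorded PCR-style value.

    import hashlib

    def measure_image(path: str, chunk_size: int = 1 << 20) -> str:
        # Hash the dumped firmware image the way an SRTM-style measurement would,
        # reading in chunks so large images need not fit in memory at once.
        digest = hashlib.sha1()
        with open(path, "rb") as image:
            for chunk in iter(lambda: image.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    print(measure_image("e6400_bios_dump.bin"))  # hypothetical dump file name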
Their investigation revealed systematic flaws and weaknesses. It has to be understood that the E6400 BIOS used in the analysis is based on Dell’s legacy BIOS core and is representative of legacy BIOSes that do not meet present-day requirements, making them more vulnerable and susceptible to threats and modification since they are readily accessible. The assumption that more recent SRTM systems include a higher degree of bug fixes, updates and protections is an unreliable one given the critical role the BIOS serves in our machines.
Conclusion
If by any chance the SRTM code is overwritten, then its core function of trust measurement is severely undermined. The module’s required immutability is simply compromised, meaning that trust in the host platform is eroded. The team further analyzed and brought to our attention the inaccuracy of some implementations, as well as the existing parasites, the tick and the flea, that subvert the measurement chain. All of these findings could serve as a basis for future studies aimed at enhancing security right from our BIOS systems.
The acceptance of this paper is one of the best things to have happened to the information technology domain. Many high-tech consumers have little understanding of the BIOS and never pay attention to its specifications when purchasing devices; they are like a driver who does not understand how the engine works. The importance of the BIOS in present-day computing can never be overstated: it is the first code that runs on our central processing units, almost no one checks its integrity, which makes it a long-term hiding place for backdoors, and BIOS overwriting can lead to highly annoying, time-consuming and hard-to-recover-from attacks, as demonstrated by the CIH virus (John, et al., 2017).
Though the two papers address one broad topic, computer security, the difference in research approach is evident. The password paper revolves around observing and interacting with ordinary users, analyzing their behavior, and relating it to other studies conducted in the same line of research. The BIOS paper, on the other hand, describes more technical research that involves no large user population, only the researchers in lab settings using software tools to establish facts and arguments from their findings. Together, however, the two bring us to the realization that, as much as we keep developing technologically, we must remain vigilant and mindful of our security, from the hardware components of our PCs and high-tech devices to the many social media accounts we subscribe to and the information systems through which we transact our business.
References
Anna, B., n.d. The Importance of Strong, Secure Passwords. [Online]
Available at: https://www.securedatarecovery.com/resources/the-importance-of-strong-secure-passwords
[Accessed 25 May 2018].
Anne, A. & Martina, A. S., 1999. Users Are Not the Enemy. s.l.: s.n.
Cormac, H., 2009. So Long, and No Thanks for the Externalities: The Rational Rejection of Security Advice by Users. s.l., s.n., pp. 133-144.
Cynthia, K., Sasha, R. & Lorrie, F. C., 2006. Human Selection of Mnemonic Phrase-Based Passwords. New York, USA, ACM, pp. 67-68.
Dinei, F. & Cormac, H., 2007. A Large-Scale Study of Password Habits. Banff, Alberta, Canada, s.n., pp. 657-665.
John, B., Corey, K. & Xeno, K., 2017. BIOS Chronomancy: Fixing the Core Root of Trust for Measurement. [Online]
Available at: https://media.blackhat.com/us-13/US-13-Butterworth-BIOS-Security-Slides.pdf
[Accessed 25 May 2018].
John, B., Corey, K., Xeno, K. & Amy, H., 2017. Problems with the Static Root of Trust for Measurement. [Online]
Available at: https://media.blackhat.com/us-13/US-13-Butterworth-BIOS-Security-WP.pdf
[Accessed 25 May 2018].
Regenscheid, A. & Scarfone, K., 2011. BIOS Integrity Measurement Guidelines (Draft), s.l.: s.n.
Rick, W., Emilee, R., Ruthie, B. & Zac, W., 2016. Understanding Password Choices: How Frequently Entered Passwords Are Re-used Across Websites. s.l., s.n., pp. 175-188.
Sansar Choinyambuu, 2011. A Root of Trust for Measurement: Mitigating the Lying Endpoint Problem of TNC, Oberseestrasse 10: Hochschule Rapperswil.
Sarah, P. et al., 2017. Let’s Go in for a Closer Look: Observing Passwords in Their Natural Habitat. Dallas, TX, USA, ACM, pp. 296-309.
Trusted Boot (tboot), 2010. An open-source pre-kernel/VMM module that uses Intel(R) Trusted Execution Technology to perform a measured and verified launch of an OS kernel/VMM, s.l.: s.n.
Trusted Computing Group, 2012. TPM PC Client Specific Implementation Specification for Conventional BIOS. Version 1.21. [Online]
[Accessed 24 May 2018].
Trusted Computing Group, n.d. Trusted Computing Group. [Online]
Available at: https://www.trustedcomputinggroup.org/
[Accessed 24 May 2018].
US-CERT, 2016. Choosing and Protecting Passwords. [Online]
Available at: https://www.us-cert.gov/ncas/tips/ST04-002
[Accessed 24 May 2018].
William, M. et al., 2016. Fast, Lean, and Accurate: Modeling Password Guessability Using Neural Networks. Austin, TX, s.n., pp. 175-191.