Trusted Computing
==Criticism==
The [[Electronic Frontier Foundation]] and the [[Free Software Foundation]] argue that trust in the underlying companies is not deserved and that the technology puts too much power and control into the hands of those who design systems and software. They also state that it may cause consumers to lose anonymity in their online interactions, as well as mandating technologies that Trusted Computing opponents say are unnecessary. They suggest Trusted Computing as a possible enabler for future versions of [[mandatory access control]], [[copy protection]], and DRM.

Some security experts, such as [[Alan Cox (computer programmer)|Alan Cox]]<ref>{{cite news | title = Trusted Computing comes under attack | url = https://www.zdnet.com/article/trusted-computing-comes-under-attack/ | work = ZDNet | first = Ingrid | last = Marson | date = 2006-01-27 | access-date = 2021-09-12 }}</ref> and [[Bruce Schneier]],<ref name = "Schneier">{{cite news | url = http://www.schneier.com/crypto-gram-0208.html#1 | title = Palladium and the TCPA | date = 2002-08-15 | work = Crypto-Gram Newsletter | author = Schneier, Bruce | access-date = 2007-02-07 | author-link = Bruce Schneier }}</ref> have spoken out against Trusted Computing, believing it will give computer manufacturers and software authors increased control to impose restrictions on what users can do with their computers. There are concerns that Trusted Computing would have an [[Anti-competitive practices|anti-competitive]] effect on the IT market.<ref name = "Anderson"/>

Critics are also concerned that it will not always be possible to examine the hardware component on which Trusted Computing relies, the [[Trusted Platform Module]], the hardware in which the core 'root' of trust in the platform has to reside.<ref name = "Anderson"/> If not implemented correctly, it presents a security risk to overall platform integrity and protected data.
The specifications, as published by the [[Trusted Computing Group]], are open and available for anyone to review. However, the final implementations by commercial vendors will not necessarily be subjected to the same review process. In addition, cryptography often moves quickly, and hardware implementations of algorithms may become inadvertently obsolete. Trusting networked computers to controlling authorities rather than to individuals may create [[digital imprimatur]]s.

Cryptographer [[Ross J. Anderson|Ross Anderson]], University of Cambridge, has expressed concern that:<ref name = "Anderson">{{cite web | url = http://www.cl.cam.ac.uk/~rja14/tcpa-faq.html | title = 'Trusted Computing' Frequently Asked Questions: TC / TCG / LaGrande / NGSCB / Longhorn / Palladium / TCPA Version 1.1 |date=August 2003 | author = Anderson, Ross | access-date = 2007-02-07 | author-link = Ross J. Anderson }}</ref>

<blockquote>TC can support remote censorship [...] In general, digital objects created using TC systems remain under the control of their creators, rather than under the control of the person who owns the machine on which they happen to be stored [...] So someone who writes a paper that a court decides is defamatory can be compelled to censor it – and the software company that wrote the word processor could be ordered to do the deletion if she refuses. Given such possibilities, we can expect TC to be used to suppress everything from pornography to writings that criticize political leaders.</blockquote>

He goes on to state that:

<blockquote>[...] software suppliers can make it much harder for you to switch to their competitors' products. At a simple level, Word could encrypt all your documents using keys that only Microsoft products have access to; this would mean that you could only read them using Microsoft products, not with any competing word processor. [...]</blockquote>

<blockquote>The [...]
most important benefit for Microsoft is that TC will dramatically increase the costs of switching away from Microsoft products (such as Office) to rival products (such as [[OpenOffice.org|OpenOffice]]). For example, a law firm that wants to change from Office to OpenOffice right now merely has to install the software, train the staff and convert their existing files. In five years' time, once they have received TC-protected documents from perhaps a thousand different clients, they would have to get permission (in the form of signed digital certificates) from each of these clients in order to migrate their files to a new platform. The law firm won't in practice want to do this, so they will be much more tightly locked in, which will enable Microsoft to hike its prices.</blockquote>

Anderson summarizes the case by saying:

<blockquote>The fundamental issue is that whoever controls the TC infrastructure will acquire a huge amount of power. Having this single point of control is like making everyone use the same bank, or the same accountant, or the same lawyer. There are many ways in which this power could be abused.</blockquote>

===Digital rights management===
One of the early motivations behind trusted computing was the desire of media and software corporations for stricter DRM technology to prevent users from freely sharing and using potentially copyrighted or private files without explicit permission. An example could be downloading a music file from a band: the band's record company could set rules for how the band's music can be used. For example, they might want the user to play the file only three times a day without paying additional money. Also, they could use remote attestation to send their music only to a music player that enforces their rules: sealed storage would prevent the user from opening the file with another player that did not enforce the restrictions.
Memory curtaining would prevent the user from making an unrestricted copy of the file while it is playing, and secure output would prevent capturing what is sent to the sound system.

===Users unable to modify software===
A user who wanted to switch to a competing program might find that it would be impossible for the new program to read old data, as the information would be "[[vendor lock-in|locked in]]" to the old program. It could also make it impossible for the user to read or modify their data except as specifically permitted by the software.

===Users unable to exercise legal rights===
The law in many countries allows users certain rights over data whose copyright they do not own (including text, images, and other media), often under headings such as [[fair use]] or [[public interest]]. Depending on jurisdiction, these may cover issues such as [[whistleblowing]], production of evidence in court, quoting or other small-scale usage, [[backup]]s of owned media, and making a copy of owned material for personal use on other owned devices or systems. The steps implicit in trusted computing have the practical effect of preventing users from exercising these legal rights.<ref name=Stallman13 />

===Users vulnerable to vendor withdrawal of service===
A service that requires external validation or permission – such as a music file or game that requires connection with the vendor to confirm permission to play or use – is vulnerable to that service being withdrawn or no longer updated.
A number of incidents have already occurred where users, having purchased music or video media, have found their ability to watch or listen to it suddenly stop due to vendor policy or cessation of service,<ref name="ms_drm">{{cite web|last=Cheng |first=Jacqui |url=https://arstechnica.com/information-technology/2008/04/drm-sucks-redux-microsoft-to-nuke-msn-music-drm-keys/ |title=DRM sucks redux: Microsoft to nuke MSN Music DRM keys |website=Ars Technica |date=2008-04-22 |access-date=2014-05-31}}</ref><ref>{{cite web|url=http://www.fudzilla.com/home/item/3495-yahoo-drm-servers-going-away?tmpl=component&print=1 |title=Yahoo! DRM servers going away |publisher=Fudzilla.com |date=2008-07-29 |access-date=2014-05-31}}</ref><ref>{{cite web|last=Fisher |first=Ken |url=https://arstechnica.com/tech-policy/2007/08/google-selleth-then-taketh-away-proving-the-need-for-drm-circumvention/ |title=Google selleth then taketh away, proving the need for DRM circumvention |website=Ars Technica |date=2007-08-13 |access-date=2014-05-31}}</ref> or server inaccessibility,<ref>{{cite web|last=Fister |first=Mister |url=http://www.shacknews.com/article/62995/ubisoft-offers-free-goodies-as |title=Ubisoft Offers Free Goodies as Compensation f - Video Game News, Videos and File Downloads for PC and Console Games at |date=26 March 2010 |publisher=Shacknews.com |access-date=2014-05-31}}</ref> at times with no compensation.<ref>{{cite web|last=Bangeman |first=Eric |url=https://arstechnica.com/uncategorized/2007/11/major-league-baseballs-drm-change-strikes-out-with-fans/ |title=Major League Baseball's DRM change strikes out with fans |website=Ars Technica |date=2007-11-07 |access-date=2014-05-31}}</ref> In other cases, the vendor refuses to provide the service in the future, leaving purchased material usable only on the present – and increasingly obsolete – hardware for as long as it lasts, but not on any hardware that may be purchased in the future.<ref name="ms_drm" />

===Users unable to override===
Some opponents of Trusted Computing advocate "owner override": allowing an owner who is confirmed to be physically present to let the computer bypass restrictions and use the secure I/O path. Such an override would allow remote attestation to a user's specification, e.g., to create certificates that say Internet Explorer is running, even if a different browser is used. Instead of preventing software change, remote attestation would indicate when the software has been changed without the owner's permission.

[[Trusted Computing Group]] members have refused to implement owner override.<ref>{{cite magazine | url = http://www.linuxjournal.com/article/7055 | title = Give TCPA an Owner Override | magazine = Linux Journal | author = Schoen, Seth | date = 2003-12-01 | access-date = 2007-02-07 | author-link = Seth Schoen }}</ref> Proponents of trusted computing believe that owner override defeats the trust in other computers, since remote attestation could then be forged by the owner. Owner override offers the security and enforcement benefits to the machine's owner, but does not allow them to trust other computers, because those computers' owners could waive rules or restrictions on their own machines. Under this scenario, once data is sent to someone else's computer – whether it be a diary, a DRM music file, or a joint project – that other person controls what security, if any, their computer will enforce on their copy of those data. This has the potential to undermine the applications of trusted computing to enforce DRM, control cheating in online games, and attest to remote computations for [[grid computing]].

===Loss of anonymity===
Because a Trusted Computing equipped computer is able to uniquely attest to its own identity, it will be possible for vendors and others who possess the ability to use the attestation feature to zero in on the identity of the user of TC-enabled software with a high degree of certainty.
Such a capability is contingent on the reasonable chance that the user at some point provides user-identifying information, whether voluntarily or indirectly, or simply through the inference of many seemingly benign pieces of data (e.g., search records, as shown by the AOL search records leak<ref>{{cite news | url = https://www.nytimes.com/2006/08/09/technology/09aol.html?pagewanted=all&_r=0 | title = A Face Is Exposed for AOL Searcher No. 4417749 | date = 2006-08-09 | access-date = 2013-05-10 | newspaper = The New York Times }}</ref>). One common way that information can be obtained and linked is when a user registers a computer just after purchase. Another common way is when a user provides identifying information to the website of an affiliate of the vendor. While proponents of TC point out that online purchases and credit transactions could potentially be more secure as a result of the remote attestation capability, this may cause the computer user to lose expectations of anonymity when using the Internet. Critics point out that this could have a chilling effect on political free speech, the ability of journalists to use anonymous sources, whistleblowing, political blogging, and other areas where the public needs protection from retaliation through anonymity.

The TPM specification offers features and suggested implementations that are meant to address the anonymity requirement. By using a third-party Privacy Certification Authority (PCA), the information that identifies the computer could be held by a trusted third party. Additionally, the use of [[direct anonymous attestation]] (DAA), introduced in TPM v1.2, allows a client to perform attestation while not revealing any personally identifiable or machine information.
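The PCA indirection can be sketched as follows. This is a hypothetical, simplified model: an HMAC stands in for the PCA's digital signature (so verifier and PCA share a key here, which a real deployment based on public-key signatures would avoid), and all names are invented for illustration.

```python
import hashlib
import hmac
import os

# Simplified, hypothetical model of a Privacy Certification Authority (PCA).
PCA_KEY = os.urandom(32)  # HMAC key stands in for the PCA's signing key

ek = os.urandom(32)   # endorsement key: uniquely identifies this TPM/machine
aik = os.urandom(32)  # attestation identity key: a fresh pseudonym for one verifier

def pca_certify(ek: bytes, aik: bytes) -> bytes:
    # The PCA sees the identifying EK (to check the TPM is genuine),
    # but certifies only the pseudonymous AIK.
    return hmac.new(PCA_KEY, aik, hashlib.sha256).digest()

def verifier_accepts(aik: bytes, cert: bytes) -> bool:
    # The verifier learns that the AIK is backed by *some* genuine TPM,
    # but never sees the EK, so it cannot identify the machine.
    expected = hmac.new(PCA_KEY, aik, hashlib.sha256).digest()
    return hmac.compare_digest(cert, expected)

cert = pca_certify(ek, aik)
```

DAA goes one step further: a group-signature-style scheme removes even the PCA's view linking EK to AIK at attestation time.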
The kind of data that must be supplied to the TTP in order to get the trusted status is at present not entirely clear, but the TCG itself admits that "attestation is an important TPM function with significant privacy implications".<ref>TPM version 1.2 specifications changes, 16.04.04</ref> It is, however, clear that both static and dynamic information about the user's computer may be supplied (Ekpubkey) to the TTP (v1.1b);<ref name="ReferenceA">TPM v1.2 specification changes, 2004</ref> it is not clear what data will be supplied to the "verifier" under v1.2. The static information will uniquely identify the endorser of the platform, the model, details of the TPM, and that the platform (PC) complies with the TCG specifications. The dynamic information is described as software running on the computer.<ref name="ReferenceA"/> If a program like Windows is registered in the user's name, this in turn will uniquely identify the user. This new technology might also introduce another privacy-infringing capability: how often you use your programs could be information supplied to the TTP.

In an exceptional but practical situation, where a user purchases a pornographic movie on the Internet, the purchaser today must accept the fact that he or she has to provide credit card details to the provider, thereby possibly risking identification. With the new technology, a purchaser might also risk someone finding out that he or she has watched this pornographic movie 1000 times. This adds a new dimension to the possible privacy infringement. The extent of data that will be supplied to the TTP/verifiers is at present not exactly known; only when the technology is implemented and used will we be able to assess the exact nature and volume of the data transmitted.
===TCG specification interoperability problems===
Trusted Computing requires that all software and hardware vendors follow the technical specifications released by the [[Trusted Computing Group]] to allow interoperability between different trusted software stacks. However, since at least mid-2006, there have been interoperability problems between the TrouSerS trusted software stack (released as open source software by [[IBM]]) and [[Hewlett-Packard]]'s stack.<ref>{{cite web | work = TrouSerS FAQ | url = http://trousers.sourceforge.net/faq.html#1.7 | title = 1.7 - I've taken ownership of my TPM under another OS... | access-date = 2007-02-07 }}</ref> Another problem is that the technical specifications are still changing, so it is unclear which implementation of the trusted stack is the standard one.

===Shutting out of competing products===
People have voiced concerns that trusted computing could be used to prevent or discourage users from running software created by companies outside of a small industry group. [[Microsoft]] has received a great deal{{vague|date=March 2015}} of bad press surrounding its [[NGSCB|Palladium]] software architecture, evoking comments such as "Few pieces of vaporware have evoked a higher level of fear and uncertainty than Microsoft's Palladium", "Palladium is a plot to take over cyberspace", and "Palladium will keep us from running any software not personally approved by Bill Gates".<ref>{{cite journal |last1=Felten |first1=E.W. |title=Understanding trusted computing: will its benefits outweigh its drawbacks?
|journal=[[IEEE Security & Privacy]] |date=May 2003 |volume=1 |issue=3 |pages=60–62 |doi=10.1109/MSECP.2003.1203224}}</ref> The concerns about trusted computing being used to shut out competition exist within a broader framework of consumers being concerned about the use of [[Product bundling|bundling]] of products to obscure prices and to engage in [[anti-competitive practices]].<ref name="anderson2"/> Trusted Computing is seen as harmful or problematic to independent and [[Open-source software|open source]] software developers.<ref>{{Cite journal|url=https://ieeexplore.ieee.org/document/1423956 |doi=10.1109/MSP.2005.40 |s2cid=688158 |title=Does Trusted Computing Remedy Computer Security Problems? |date=2005 |last1=Oppliger |first1=R. |last2=Rytz |first2=R. |journal=IEEE Security and Privacy Magazine |volume=3 |issue=2 |pages=16–19 |url-access=subscription }}</ref>

===Trust===
In the widely used [[public-key cryptography]], creation of keys can be done on the local computer, and the creator has complete control over who has access to them and, consequently, over their own [[security policy|security policies]].<ref>[http://grouper.ieee.org/groups/1363/ "IEEE P1363: Standard Specifications For Public-Key Cryptography", Retrieved March 9, 2009.]
{{Webarchive|url=https://web.archive.org/web/20141201024245/http://grouper.ieee.org/groups/1363/ |date=December 1, 2014 }}</ref> In some proposed encryption-decryption chips, a private/public key pair is permanently embedded into the hardware when it is manufactured,<ref>{{Cite web|url=https://doi.org/10.1145/945445.945464|title=Terra: a virtual machine-based platform for trusted computing|first1=Tal|last1=Garfinkel|first2=Ben|last2=Pfaff|first3=Jim|last3=Chow|first4=Mendel|last4=Rosenblum|first5=Dan|last5=Boneh|date=October 19, 2003|publisher=Association for Computing Machinery|pages=193–206|via=ACM Digital Library|doi=10.1145/945445.945464|s2cid=156799 }}</ref> and hardware manufacturers would have the opportunity to record the key without leaving evidence of doing so. With this key it would be possible to access data encrypted with it and to authenticate as the device.<ref>These are the functions of the private key in [http://www.di-mgt.com.au/rsa_alg.html the RSA algorithm]</ref> It would be trivial for a manufacturer to give a copy of this key to a government or to software manufacturers, as the platform must go through steps so that it works with authenticated software.
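This worry can be made concrete with textbook RSA, following the RSA reference cited above: whoever retains the private exponent embedded at manufacture can both read the device's encrypted data and authenticate as the device. The toy primes and function names below are for illustration only; real chips use far larger keys.

```python
# Textbook RSA with toy primes; insecure, for illustration only.
p, q = 61, 53
n = p * q                          # public modulus, published in the chip's certificate
e = 17                             # public exponent
d = pow(e, -1, (p - 1) * (q - 1))  # private exponent, embedded in the hardware

def encrypt_to_device(m: int) -> int:
    return pow(m, e, n)  # anyone can encrypt to the device's public key

def device_decrypt(c: int) -> int:
    return pow(c, d, n)  # requires d: the chip, or whoever kept a copy of d

def sign_as_device(m: int) -> int:
    return pow(m, d, n)  # with a retained d, a third party can authenticate as the chip

msg = 65
c = encrypt_to_device(msg)
```

The same retained exponent thus also lets its holder forge the attestation signatures that remote parties rely on, which is why manufacturing-time key handling matters.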
Therefore, to trust anything that is authenticated by or encrypted by a TPM or a Trusted computer, an [[end user]] has to trust the company that made the chip, the company that designed the chip, the companies allowed to make software for the chip, and the ability and interest of those companies not to compromise the whole process.<ref>{{cite web |last1=Sullivan |first1=Nick |title=Deploying TLS 1.3: the great, the good and the bad (33c3) |url=https://www.youtube.com/watch?time_continue=1533&v=0opakLwtPWk |website=media.ccc.de |date=27 December 2016 |publisher=YouTube |access-date=30 July 2018}}</ref> A security breach breaking that chain of trust happened to the [[SIM card]] manufacturer [[Gemalto]], which in 2010 was infiltrated by US and British spies, resulting in the compromised security of cellphone calls.<ref>{{Cite web |url = https://firstlook.org/theintercept/2015/02/19/great-sim-heist |title = The Great SIM Heist: How Spies Stole the Keys to the Encryption Castle |date = 2015-02-19 |access-date = 2015-02-27 |website = firstlook.org}}</ref> It is also critical that one be able to trust that hardware manufacturers and software developers properly implement trusted computing standards. An incorrect implementation could be hidden from users, and thus could undermine the integrity of the whole system without users being aware of the flaw.<ref name="schoen-promise-risk">[http://pascal.case.unibz.it/handle/2038/871 Seth Schoen, "Trusted Computing: Promise and Risk", ''COSPA Knowledge Base: Comparison, selection, & suitability of OSS'', April 11th, 2006.] {{Webarchive|url=https://web.archive.org/web/20090319043100/http://pascal.case.unibz.it/handle/2038/871 |date=2009-03-19 }}</ref>