== Significance ==

The discovery of the ELIZA effect was an important development in [[artificial intelligence]], demonstrating the principle of using [[Social engineering (security)|social engineering]] rather than explicit programming to pass a [[Turing test]].<ref name="Trappl2002">{{cite book|title=Emotions in Humans and Artifacts|last1=Trappl|first1=Robert|last2=Petta|first2=Paolo|last3=Payr|first3=Sabine|page=353|year=2002|isbn=978-0-262-20142-1|quote=The "Eliza effect" – the tendency for people to treat programs that respond to them as if they had more intelligence than they really do (Weizenbaum 1966) is one of the most powerful tools available to the creators of virtual characters.|url=https://books.google.com/books?id=jTgMIhy6YZMC&pg=PA353|publisher=MIT Press|location=Cambridge, Mass.}}</ref> ELIZA convinced some users that they were conversing with a human. This shift in human–machine interaction marked progress in technologies emulating human behavior.

William Meisel distinguishes two groups of chatbots: "general [[personal assistant]]s" and "specialized digital assistants".<ref name=":0">{{Cite journal|last=Dale|first=Robert|date=September 2016|title=The return of the chatbots|journal=Natural Language Engineering|language=en|volume=22|issue=5|pages=811–817|doi=10.1017/S1351324916000243|issn=1351-3249|doi-access=free}}</ref> General digital assistants have been integrated into personal devices, with skills such as sending messages, taking notes, checking calendars, and setting appointments. Specialized digital assistants "operate in very specific domains or help with very specific tasks".<ref name=":0" />

Weizenbaum considered that not every aspect of human thought could be reduced to logical formalisms and that "there are some acts of thought that ought to be attempted only by humans".<ref>{{Cite book |last=Weizenbaum |first=Joseph |url=https://www.worldcat.org/oclc/1527521 |title=Computer power and human reason : from judgment to calculation |date=1976 |publisher=W. H. Freeman and Company |isbn=0-7167-0464-1 |location=San Francisco, Cal. |oclc=1527521}}</ref>

When chatbots are [[Anthropomorphism|anthropomorphized]], they tend to be given gendered features, which become a way through which users establish relationships with the technology. "Gender stereotypes are instrumentalised to manage our relationship with chatbots" when human behavior is programmed into machines.<ref>[https://2018.xcoax.org/pdf/xCoAx2018-Costa.pdf Costa, Pedro. Ribas, Luisa. Conversations with ELIZA: on Gender and Artificial Intelligence. From (6th Conference on Computation, Communication, Aesthetics & X 2018) Accessed February 2021]</ref> Feminized labor, or [[women's work]], automated by anthropomorphic digital assistants reinforces an "assumption that women possess a natural affinity for service work and emotional labour".<ref>Hester, Helen. 2016. "Technology Becomes Her." New Vistas 3 (1): 46–50.</ref> When our relationship with digital assistants is defined through their human attributes, chatbots become gendered entities.