Human Computer Interface
The Human-Computer Interface (HCI) deals with the methods by which computers and their users communicate. It is the discipline of designing interface software so that computers are pleasant to use, easy to learn, and do what people want them to do. HCI requires the study not only of the computer's hardware and software, but of the human side as well; attention must therefore be paid to human psychology and physiology.

This is because building better two-way communication requires knowing the capabilities and limitations of both sides. This seminar also covers concepts and guidelines that should be followed in order to produce a good HCI. Topics specifically covered include Dialogue Design, Presentation Design, and General Input and Output.


This section mainly deals with the way humans communicate.

The human brain is where all cognitive functions take place; it is ultimately where a human receives, interprets, and stores information. The sense organs can gather information and send it to the brain faster and more precisely than the brain can handle. Many models have been developed that apply a computer analogy to brain functions, with mixed success. They are nevertheless quite useful because they give us a framework with which to illustrate human capabilities and limitations.

These models suggest that there are two forms of human memory: short-term and long-term. Each sense appears to have its own short-term memory, which acts like a buffer or staging area for input from that sense organ to the brain. Any memory that is not reinforced and moved to long-term memory is forgotten. Short-term memory has a capacity of about seven chunks of information, though this capacity seems to increase with practice and with added levels of abstraction and association.
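As a rough sketch (my own illustration, not code from the seminar), the buffer-like behaviour of short-term memory, its roughly seven-chunk limit, and the benefit of chunking can be modelled in a few lines of Python. The capacity constant and the sample data are illustrative assumptions:

```python
from collections import deque

# Illustrative model: short-term memory as a bounded buffer.
# The capacity of 7 follows the "about seven chunks" figure above.
STM_CAPACITY = 7

def perceive(items):
    """Push incoming items through a bounded short-term buffer.
    Once the buffer is full, each new item displaces the oldest,
    unreinforced one (i.e., it is forgotten)."""
    stm = deque(maxlen=STM_CAPACITY)
    for item in items:
        stm.append(item)
    return list(stm)

digits = list("149220010731")      # 12 raw digits: exceeds the buffer
chunks = ["1492", "2001", "0731"]  # the same data as 3 chunks: fits easily

print(perceive(digits))  # only the most recent 7 digits survive
print(perceive(chunks))  # all 3 chunks are retained
```

The same information survives or is lost depending purely on how it is grouped, which is why abstraction and association effectively raise the capacity.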

For information to be remembered, it must be moved into long-term memory. This can be a conscious act, as in deliberately memorizing something through repetition, or unconscious, as when a particularly interesting piece of data is received and given more thought. No maximum size of long-term memory has yet been determined. This aspect of memory, together with the fact that the human brain can process only so much information at once, is important to the layout of an HCI. People sometimes describe a particular screen as "too busy", meaning there is too much information on the screen at once: the brain cannot take it all in, and ambiguity and confusion result. Precision should be a primary concern for the HCI designer.
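Continuing the same toy model (again a sketch of the idea, not the seminar's own material), deliberate memorization through repetition can be represented as reinforcement: an item rehearsed often enough is promoted from the transient buffer to an unbounded long-term store. The rehearsal threshold here is an arbitrary illustrative value, not a psychological constant:

```python
from collections import Counter

REHEARSALS_NEEDED = 3  # illustrative threshold for "enough repetition"

def memorize(stream):
    """Promote an item to long-term memory once it has been
    rehearsed REHEARSALS_NEEDED times; items seen fewer times
    are treated as forgotten."""
    seen = Counter()
    long_term = set()        # no maximum size, per the text above
    for item in stream:
        seen[item] += 1
        if seen[item] >= REHEARSALS_NEEDED:
            long_term.add(item)
    return long_term

# "password" is rehearsed three times and sticks; the rest fade away.
print(memorize(["password", "ad", "password", "banner", "password"]))
```

The unbounded `set` mirrors the observation that no ceiling on long-term memory has been found, while unrepeated items simply never make the transfer.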

