User:Vtaylor/CIS2/Computer and information sciences/
Students @ Work - a student collaborative writing project.
Help us by providing feedback on the Discussion page.
CIS 2 Computers and the Internet in Society SPRING 2009 Final Projects
For project communication, draft document sharing, revision, final submission preparation and publication, we will be using space in the public WikiEducator wiki.
- 1 Computer and Information Sciences
- 2 Fields Applied
- 3 Social Effects
- 4 Issues Raised
- 5 Open the Future
- 6 Contributors
- 7 References
Computer and Information Sciences
Computer science is the study of the theory and methods of processing information in digital computers, the design of computer hardware and software, and the applications of computers. Computer science combines software and hardware studies to develop innovative products. Theory in computer science covers a broad spectrum of research that includes computational complexity theory, programming language theory, computer programming, and human-computer interaction.
Early development of computing was limited to military applications. The military uses computers to store and process data on personnel, weather, communications, and finances, and to operate sophisticated equipment during combat and peacetime maneuvers. As computing power increased, more complicated algorithms and simulations were developed. Developers are now concentrating on systems such as voice recognition and expert systems.
Military computer-based training systems feature simulation of operational strategies, diverse combat situations, and operational planning.
IBM Roadrunner, a supercomputer with 12,960 chips, was built and developed by engineers and scientists from IBM and Los Alamos National Laboratory. The machine was twice as fast as IBM's BlueGene/L and cost $133 million to build. It was developed to solve classified military problems and to run global climate models with higher accuracy. One day of Roadrunner's computing power was said to be equivalent to 6 billion hand calculators working for 46 years.
Other military applications of Computer and Information Sciences include the following:
- Powered exoskeleton
- Bomb disposal Robots
- Unmanned ground vehicle
- Unmanned combat air vehicle
- Autonomous underwater vehicle
Advances in communication and information technology have changed the face and future of business. Today, as information travels faster, the world seems smaller, and this has big implications for the way we do business. Storing information on computers rather than on paper, for instance, has made it easier to access. Documents prepared on a computer have higher quality and more consistent formatting than those written by hand. Email allows businesses to communicate and to send data or files quickly to anywhere in the world. To gain these benefits, businesses must adopt technology and set up the necessary infrastructure.
Office equipment includes fax machines, computers, scanners, pagers, and conferencing capabilities (telephone, video, and satellite)
Smartphones such as the BlackBerry, O2, HTC, Palm, and iPhone are used in business for remote access to office resources such as email and documents via POP3, IMAP, Microsoft SharePoint, and MS Exchange Server. The workforce has become mobile, conducting business outside traditional office settings through the use of Personal Digital Assistants (PDAs), cellular phones, and laptop computers. Easier access to the Internet has allowed more employees to become "telecommuters," who conduct work-related activities either from their homes or from some other remote location.
Collaboration technologies, currently being enhanced by Microsoft and IBM, may enable companies to conduct "virtual meetings" in the near future. In a virtual meeting, employees at remote locations conduct real-time meetings from their own computers using peer-to-peer software. Participants can see one another on computer screens, share computer space, and make changes to product designs or contract documents via a "virtual whiteboard." Beyond that, business systems such as ERP, SAP, and CRM centralize information and support business decision making. In this fast-moving world, the best decision-support systems present managers and directors with feasible options and move them toward the solution that brings the most benefit with the least risk; such systems are usually implemented in finance, capital fund, and investment firms. For wholesale and logistics businesses, a business intelligence system like Cognos affects daily operations by centralizing information and databases and by drawing a big-picture view of operating statistics, the sales process, and production, helping business owners make the right decisions.
"Computers are being used to document patient care, assist in the diagnosis and management of a variety of health conditions, measure clinical outcomes to improve quality of care, and in administrative and financial management decisions." Computer systems in health care store and transfer patient information between hospitals, improving the quality of the advice given to solve clinical problems. Computers and databases hold information that can be compared with expected results to help physicians make decisions. Computer systems with high-performance graphics features improve the detection of diseases and cancers. Medical microchip implants and biosensors are being used as mental prostheses to compensate for a loss of normal function, to remotely monitor patients' vital signs, to control the delivery of medications, and to communicate with geographically distant health care professionals and the outside environment.
- Microchip Monitors Patient Drug Regimen.
- For heart patients, an implanted defibrillator in the heart controls heart rhythm, records the heart's activity, and then sends it off electronically to their doctor.
- “Many heart attack victims, especially women, experience nonspecific symptoms and secure medical help too late after permanent damage to the cardiac tissue has occurred,” says John T. McDevitt, principal investigator and designer of the nano-bio-chip. “Our tests promise to dramatically improve the accuracy and speed of cardiac diagnosis.”
- 'Body-on-a-Chip' reduces the cost of developing new drugs - A new kind of microchip can host human cells to mimic the reaction of different tissues in the body. The chip could help reduce the need for animal testing and lower the cost of developing new pharmaceuticals. Medical researchers are using it to study the effect of chemotherapy drugs on cancer cells.
Computer Aided Radiology, Computer Aided Surgery, Augmented Reality, Telemedicine, Robotized Teleoperating Systems
Computer-aided detection (CAD) for Mammography is a new and evolving topic in the realm of breast radiology.
Another area computers have impacted is entertainment. Most people like science fiction or action movies, especially the ones with neat-looking effects, and those effects are mostly made by computers. Most movies today use computer graphics to make things more realistic but not real. Computer graphics are mostly used for spaceships, aliens, monsters, and special effects. To the left is a picture from the movie "Godzilla." Godzilla was created with computer animation, texturing, and graphics to make him more realistic than the older version, in which a man in a costume played Godzilla. This wasn't the only movie made with computers: movies like Jurassic Park, Wing Commander, Starship Troopers, Star Wars SE, and the latest Star Trek movies used computers to make them look more interesting and realistic. There are even movies made completely by computer, like Toy Story and A Bug's Life.
Movies are not the only works to use computer animation and graphics. Games on the latest platforms, such as the PC, PlayStation, and Nintendo 64, use computers to create ever more impressive games. That is how computers have impacted today's entertainment.
This is the "Age of Information Technology," and it has had significant effects on our lives.
Effects on Social Development and Relationships
The use of home computers not only can influence children's cognitive and academic skills, but can also shape children's social interactions and development. In children's interactions with parents and other adult authority figures, one obvious effect has been the frequent reversal of the traditional parent-child relationship, with the computer-savvy child taking on the role of teacher to the parent. Several studies have found, for example, that teenagers are more likely to help their parents with computers than parents are to help their children, with boys disproportionately helping their fathers and girls disproportionately helping their mothers.[41] In addition, some have hypothesized that the equality of online communications among computer users of all ages tends to erode authority structures, with the result that children will be less accepting of parental authority. (7)
Today, communication technologies such as blogs, social networks, cellphones, and email play an important role in society. Families, friends, and businesses are able to communicate more quickly than in the past. The Internet has opened the door of communication for a lot of people. People with disabilities meet barriers of all types; however, technology is helping to lower many of them. By using computing technology for tasks such as reading and writing documents, communicating with others, and searching for information on the Internet, students and employees with disabilities are capable of handling a wider range of activities independently. Still, people with disabilities face a variety of barriers to computer use. These barriers can be grouped into three functional categories: barriers to providing computer input, interpreting output, and reading supporting documentation. Hardware and software tools (known as adaptive or assistive technologies) have been developed to provide functional alternatives to these standard operations.
Besides the positive impacts of computers on society, they also come with negative sides:
- Fear of losing data or electronic files.
- Decrease in person-to-person contact.
- Less tolerance for errors.
- Frustration with co-workers who are not up to date.
- Overload of data and information.
- Less distinction between work and personal time.
- Fear of "Big Brother" watching.
Computer Ethics is a branch of practical philosophy which deals with how computing professionals should make decisions regarding professional and social conduct. The term "computer ethics" was first coined by Walter Maner in the mid-1970s, but only since the 1990s has it started being integrated into professional development programs in academic settings. The conceptual foundations of computer ethics are investigated by information ethics, a branch of philosophical ethics established by Luciano Floridi. Computer ethics is a very important topic in computer applications. Identifying ethical issues as they arise, as well as defining how to deal with them, has traditionally been problematic in computer ethics. Some have argued against the idea of computer ethics as a whole. However, Collins and Miller proposed a method of identifying issues in computer ethics in their Paramedic Ethics model. The model is a data-centered view of judging ethical issues, involving the gathering, analysis, negotiation, and judging of data about the issue.
In solving problems relating to ethical issues, Michael Davis proposed a unique problem-solving method. In Davis's model, the ethical problem is stated, facts are checked, and a list of options is generated by considering relevant factors relating to the problem. The actual action taken is influenced by specific ethical standards. (10)
Articles on Privacy: Today, many people rely on computers to do homework, work, and create or store useful information. Therefore, it is important that this information be stored and kept properly. It is also extremely important to protect computers from data loss, misuse, and abuse. For example, businesses need to keep their information secure and shielded from hackers, and home users need to ensure their credit card numbers are secure when participating in online transactions. A computer security risk is any action that could cause loss of information or software, data or processing incompatibilities, or damage to computer hardware.
An intentional breach of computer security is known as a computer crime, which is slightly different from a cybercrime. A cybercrime is an illegal act carried out over the Internet, and cybercrime is one of the FBI's top priorities. There are several distinct categories of people who perpetrate cybercrimes: hacker, cracker, cyberterrorist, cyberextortionist, unethical employee, script kiddie, and corporate spy. A hacker is defined as someone who accesses a computer or computer network unlawfully; hackers often claim they do this to find leaks in the security of a network. (11)
Economics, location, education, and politics create a technology gap between people who have more opportunity to reach technology and those who have less. In rich, developed countries this gap may be narrower because of social equity, rising living standards, and mass production. A newly introduced technology is often too expensive for poor people, but over time its cost falls and more people have the chance to own it. In poorer countries, however, even older technology can seem out of reach: in the U.S. nearly every household has at least one car, but in many poor countries owning a car, or a computer, is a luxury.
The lack of access to new technology makes people in those places seem almost invisible to the rest of the world, because they cannot reach new information, new technology, or even political participation. Their daily lives focus on finding food and supporting their families. They may never make a phone call, access the Internet, or watch television; they remain people of an older age.
Links to Violent Behavior Raise Concerns
Although educational software for home computer use includes many games that encourage positive, pro-social behaviors by rewarding players who cooperate or share, the most popular entertainment software often involves games with competition and aggression,[50] and the amount of aggression and violence has increased with each new generation of games.[51] A content analysis of recent popular Nintendo and Sega Genesis computer games found that nearly 80% of the games had aggression or violence as an objective.[52] One survey of seventh- and eighth-grade students found that half of their favorite games had violent themes.[34] Yet parents often are unaware of even the most popular violent titles, despite the rating system from the Entertainment Software Ratings Board in place since September 1994 (see Box 3). In a 1998 survey, 80% of junior high students said they were familiar with Duke Nukem, a violent computer game rated "mature" (containing animated blood, gore, and violence and strong sexual content), but fewer than 5% of parents had heard of it. (8)
From BBC: Our "digital footprint" - the sharing of more and more aspects of our lives through digital photography, podcasting, blogging, and video - is set to get bigger, and this will raise key questions about how much information we should store about ourselves. The ever-present network will channel mass-market information directly to us while disseminating our own intimate information. The report dubs this the era of so-called hyper-connectivity and predicts it will mean a growth in "techno-dependency". This ever more intimate relationship between humans and computers will be a double-edged sword, it suggests: "Without proper consideration and control it is possible that we - both individually and collectively - may no longer be in control of ourselves or the world around us."
Open the Future
Moore's Law - Future Trends
"The best way to predict the future is to invent it." (Alan Kay) Moore's law is an observation about the computer hardware industry that describes how manufacturing technology will evolve: the number of transistors on an integrated circuit doubles roughly every two years. This trend was noted by Gordon Moore of Intel and is known as Moore's law. It held all the way from 1965 to 2005 and is not expected to stop until at least 2015. Because of this trend, electronic devices are now much more complex, cheaper, and easier to manufacture.
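The doubling rule lends itself to a quick back-of-envelope calculation. The sketch below is illustrative only: the 1971 baseline of 2,300 transistors (the Intel 4004) is a well-known figure, but a simple doubling formula ignores the bumps in any real manufacturing roadmap.

```python
# Back-of-envelope Moore's law projection: transistor counts double
# roughly every two years from a known baseline.
def transistors(year, base_year=1971, base_count=2300):
    """Projected transistor count for a given year, doubling every
    two years from the Intel 4004 (2,300 transistors in 1971)."""
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

for year in (1971, 1981, 1991, 2001):
    print(year, round(transistors(year)))
```

Running this gives 2,300 transistors for 1971 and about 75 million for 2001, which is the right order of magnitude for CPUs of that era.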
A quantum computer is a device for computation that makes direct use of quantum mechanical phenomena, such as superposition and entanglement, to perform operations on data. The basic principle behind quantum computation is that quantum properties can be used to represent data and perform operations on these data. If large-scale quantum computers can be built, they will be able to solve certain problems much faster than any of our current classical computers.
Today's computers use the movement of electrons in-and-out of transistors to do logic. Computers of the future may utilize crystals and metamaterials to control light. Optical computers make use of light particles called photons. Photonic computing is intended to use photons or light particles, produced by lasers, in place of electrons. Compared to electrons, photons are much faster – light travels about 30 cm, or one foot, in a nanosecond – and have a higher bandwidth. While photonic computing is still seen as impractical by many, research is being pushed along by strong market forces already implementing networking and, thus, creating opportunities. Recent years have seen the development of new conducting polymers which create transistor-like switches that are smaller, and 1,000 times faster, than silicon transistors.
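The "30 cm in a nanosecond" figure quoted above is easy to verify from the speed of light; this check is plain arithmetic, not a model of any optical hardware.

```python
# How far light travels in one nanosecond.
c = 299_792_458       # speed of light in vacuum, in m/s
ns = 1e-9             # one nanosecond, in seconds
distance_cm = c * ns * 100
print(f"{distance_cm:.1f} cm")   # about 30 cm, roughly one foot
```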
By computing many different numbers simultaneously and then interfering the results to get a single answer, a quantum computer can perform a large number of operations in parallel and ends up being much more powerful than a digital computer of the same size.
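The superposition idea can be illustrated with a toy one-qubit simulation. This is a sketch using plain Python complex numbers, not a real quantum device or library: a qubit is stored as two complex amplitudes, and a Hadamard gate turns the |0> state into an equal superposition, so each measurement outcome has probability 1/2.

```python
import math

# A qubit state is a pair of complex amplitudes (a, b) for |0> and |1>;
# the probabilities of measuring 0 or 1 are |a|^2 and |b|^2.
def hadamard(state):
    """Apply a Hadamard gate, which maps |0> to (|0> + |1>) / sqrt(2)."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

zero = (1 + 0j, 0 + 0j)                      # the |0> basis state
superposed = hadamard(zero)
probs = [abs(amp) ** 2 for amp in superposed]
print(probs)    # both close to 0.5: equal chance of measuring 0 or 1
```

Applying `hadamard` a second time returns the qubit to |0>, a small demonstration of the interference described above.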
Scientists aim to use nanotechnology to create nanorobots that will serve as programmable antibodies. These would help protect humans against pathogenic bacteria and viruses, which keep mutating and render many remedies ineffective against new strains. Nanorobots could overcome this problem by being selectively reprogrammed to destroy the new pathogens, and they are predicted to be part of the future of human medicine.
Artificial Intelligence & Robotics
Currently, AI applications focus on speech recognition, computer vision, text analysis, and robot control; AI will play an important role in the future of computer and information sciences. In the near future, technologies in narrow fields such as speech recognition will continue to improve and will reach human levels. AI will be able to communicate with humans in unstructured English using text or voice, navigate (not perfectly) in an unprepared environment, and have some rudimentary common sense (and domain-specific intelligence).
The future of robotics has certainly been addressed several times over; from books to movies, virtually everyone has at least some idea of how they believe robotics technology will play out. That future looks bright, though it is still being pondered just how robotics will affect future generations. Robotic devices that can lead the blind, aid the elderly, and even clean house when needed are all being considered, although the timeline for these developments is far from clear right now.
Looking into the past, you can see that change comes largely with innovation. It is assumed that robotics will play a large role in law enforcement and security fields. As of today, robots are already being used for dangerous and hazardous tasks such as bomb disposal, hostage recovery, and search and rescue operations.
Since a robot named Sojourner has already been sent to operate on another planet, it makes the mind wonder whether space construction, assembly, communications, and even agriculture will be possible in the future with robotics in space.
Computers Like the Human Brain
A computer that can work like the human brain does not exist yet. Computational capability is increasing at a dramatic rate: every year computers become smaller and faster and handle larger banks of memory. Computers are typically measured by how many Millions of Instructions Per Second (MIPS) they can perform. A computer that could replicate the human brain is expected to need close to a billion MIPS. It is estimated that such a capability might be available, at a cost-effective price point, by 2020.
Virtual education refers to instruction in a learning environment where teacher and student are separated by time or space, or both, and the teacher provides course content through course management applications, multimedia resources, the Internet, videoconferencing, etc. Students receive the content and communicate with the teacher via the same technologies. Virtual education is a term describing online education using the Internet. This term is primarily used in higher education where so-called Virtual Universities have been established.
Researchers in Human-Computer Interaction (HCI) predict that in little more than a decade, computers will be able to anticipate their users' needs, and humans, in turn, will be able to interact semantically with computers.
- Vu Le
- Sapna Agrawal
- Potter, Johanson, and Hutinger. "Creative Software Can Extend Children's Expressiveness." The Center for Best Practices in Early Childhood, College of Education and Human Services at Western Illinois University, 2001.
- "The Impact of Computer." http://tatooine.fortunecity.com
- Nick Lemons. "Computers' Impact on Education, Business, Entertainment, History and Future, and the Private Sector."
- Alfred Lewis. "The New World of Computers." Dodd, Mead, and Company, New York, 1965.
- Roger C. Schank and Peter G. Childers. "The Cognitive Computer." Addison-Wesley Publishing.
- www.geeks.com. "Future of Computer Technology."
- www.washington.edu. "Working Together: People with Disabilities and Computer Technology."