
Flat Classroom Project: Group items tagged "computers"



Home - MSc Mobile and Ubiquitous Computing : Trinity College Dublin - 0 views

  •  
    "Mobile computing allows people to make use of computing and information systems without being tied to a desktop computer located in their office, classroom, or home. People can now make use of computer systems while on the move, whether waiting for a flight in some airport departure lounge, drinking coffee in their favorite cafe, simply driving home, or even just walking down the street. Thanks to the improved portability and processing power of laptop computers, Personal Digital Assistants, and even mobile phones, as well as improved battery life and the near universal coverage of wireless data communications networks, mobile computer users can now make use of almost the same range of services as desktop users. While the use of current mobile computers often follows the traditional pattern of a single user interacting with their dedicated computer via its own display and keyboard, mobile computing is still at an early stage of development. In his seminal paper on the computer for the 21st century written in 1991†, Marc Weiser noted that "The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it." Weiser put forward a vision of future computer systems in which "computers themselves vanish into the background". In doing so he inspired a field of research known as ubiquitous computing . In the ubiquitous computing vision, interconnected (mobile) computers are embedded unobtrusively in everyday appliances and environments and co-operate to provide information and services on behalf of their users. The ubiquitous computing vision is now becoming a reality enabled by recent and expected developments in new sensor technologies - increasing the range of stimuli that can be effectively sensed, by wireless networking - allowing mobile computer systems to co-operate, by miniaturization of computational devices - allowing massive deployment of sensor-based systems in every

Resource #2 - 0 views

  • The first computers, constructed during World War II, employed radio valves, which were switched on and off to represent binary digits. But soon thereafter, the semiconductor was invented; it used much less electricity and thus did not overheat so easily, and it was sturdier. (V. Ramamurti, an Indian scientist, believed that the semiconductor was invented because the Allies feared the loss to Japan of India, the Allies' prime source of mica, which was essential to the making of radio valves.) Technological development of computers and of their multifarious applications has since been driven by the progressive reduction in the size and cost of semiconductors.
  • The first computers in the 1940s were as big as a house; by the 1960s, however, miniaturization of semiconductors had made it possible to create computers that were no bigger than a small room. At that point, IBM began to make a series of standardized computers; its 1620 and 360 series of mainframe computers found users all over the world, including India. The Indian government imported a few computers from the Soviet Union, especially the ES EVM, its IBM 360 clone; but they were not popular, even in the government establishments where they were installed. IBM computers dominated the market. They were used for calculation, accounting and data storage in large companies, and in research laboratories. Tata Consultancy Services, India's largest software producer, was established in 1968 to run the computers acquired by the Tata group and to develop uses for them.
  • By the 1980s, computer chips were becoming small enough to be embodied in almost portable minicomputers, and these were getting cheap enough to be used in small businesses. Manufacturers began to build into minicomputers a selection of programs that performed the most common operations, such as word processing, calculation, and accounting. Over the 1980s, minicomputers shrank in size and weight and were transformed into personal computers (PCs). Indian agents who sold imported minicomputers and PCs also employed software engineers for sales assistance and service. Thus, in the latter half of the 1980s, Indian software engineers were scattered. Some worked in CMC; others serviced the surviving IBM machines in companies, government establishments, and research facilities; and still others serviced minicomputers and PCs.
  • By 1985 satellite links made the export of software possible without having to send programmers abroad. At that time, however, the Indian government did not allow private links, so Texas Instruments gave it the equipment, which it then proceeded to use from its Bangalore establishment. IBM, which wanted to set up a link in 1988, ran into the same problem: the government insisted on retaining its monopoly in telecommunications, the rates offered by its Department of Telecommunications were exorbitant, and it was inexperienced in running Very Small Aperture Terminal (VSAT) links.
  • In 1991 the Department of Electronics broke this impasse, creating a corporation called Software Technology Parks of India (STPI) that, being owned by the government, could provide VSAT communications without breaching its monopoly. STPI set up software technology parks in different cities, each of which provided satellite links to be used by firms; the local link was a wireless radio link. In 1993 the government began to allow individual companies their own dedicated links, which allowed work done in India to be transmitted abroad directly. Indian firms soon convinced their American customers that a satellite link was as reliable as a team of programmers working in the clients' office.
  • In the 1980s, an importer of hardware had to get an import license from the chief controller of imports and exports, who in turn required a no-objection certificate from the Department of Electronics. That meant going to Delhi, waiting for an appointment, and then trying to persuade an uncooperative bureaucrat. In 1992 computers were freed from import licensing, and import duties on them were reduced.
  • Satellites and import liberalization thus made offshore development possible, with a number of implications: It enabled firms to take orders for complete programs, to work for final clients and to market their services directly. Work for final clients also led firms to specialize in work for particular industries or verticals: it led in particular to India's specialization in software for banking, insurance, and airlines. It gave India a brand value and a reputation.
  • The late 1990s saw a surge in the Indian IT industry. To assure potential clients of their permanency, Indian software companies built large, expensive campuses, where they made working conditions as attractive as possible, to help them retain workers. Trees grew and streams flowed inside buildings, and swimming pools, badminton courts, meditation rooms, auditoriums, and restaurants were provided.
  • The IT boom in the United States was the source of India's software exports.

Laptop - Wikipedia, the free encyclopedia - 0 views

  • A laptop is a personal computer designed for mobile use that is small and light enough for a person to rest on their lap.
  • As the personal computer became feasible in the early 1970s, the idea of a portable personal computer followed. A "personal, portable information manipulator" was imagined by Alan Kay at Xerox PARC in 1968,[2] and described in his 1972 paper as the "Dynabook".[3]
  • The IBM SCAMP project (Special Computer APL Machine Portable) was demonstrated in 1973. This prototype was based on the PALM processor (Put All Logic In Microcode). The IBM 5100, the first commercially available portable computer, appeared in September 1975, and was based on the SCAMP prototype.[4] As 8-bit CPU machines became widely accepted, the number of portables increased rapidly. The Osborne 1, released in 1981, used the Zilog Z80 and weighed 23.6 pounds (10.7 kg). It had no battery, a 5 in (13 cm) CRT screen, and dual 5.25 in (13.3 cm) single-density floppy drives. In the same year the first laptop-sized portable computer, the Epson HX-20, was announced.[5] The Epson had an LCD screen, a rechargeable battery, and a calculator-size printer in a 1.6 kg (3.5 lb) chassis. Both Tandy/RadioShack and HP also produced portable computers of varying designs during this period.[6][7] The first laptops using the flip form factor appeared in the early 1980s. The Dulmont Magnum was released in Australia in 1981–82, but was not marketed internationally until 1984–85. The $8,150 ($18,370 in current dollar terms) GRiD Compass 1100, released in 1982, was used at NASA and by the military, among others. The Gavilan SC, released in 1983, was the first computer described as a "laptop" by its manufacturer.[8] From 1983 onward, several new input techniques were developed and included in laptops, including the touchpad (Gavilan SC, 1983), the pointing stick (IBM ThinkPad 700, 1992) and handwriting recognition (Linus Write-Top,[9] 1987). Some CPUs, such as the 1990 Intel i386SL, were designed to use minimum power to increase the battery life of portable computers, and were supported by dynamic power management features such as Intel SpeedStep and AMD PowerNow! in some designs. Displays reached VGA resolution by 1988 (Compaq SLT/286), and color screens started becoming a common upgrade in 1991, with increases in resolution and screen size occurring frequently until the introduction of 17"-screen laptops in 2003. Hard drives started to be used in portables, encouraged by the introduction of 3.5" drives in the late 1980s, and became common in laptops starting with the introduction of 2.5" and smaller drives around 1990; capacities have typically lagged behind those of physically larger desktop drives. Optical storage, read-only CD-ROM followed by writeable CD and later read-only or writeable DVD and Blu-ray, became common in laptops during the 2000s.

Multi-core processor - Wikipedia, the free encyclopedia - 0 views

  • In computing, a processor is the unit that reads and executes program instructions, which are fixed-length (typically 32 or 64 bit) or variable-length chunks of data. The data in the instruction tells the processor what to do. The instructions are very basic things like reading data from memory or sending data to the user display, but they are processed so rapidly that we experience the results as the smooth operation of a program. Processors were originally developed with only one core. The core is the part of the processor that actually performs the reading and executing of the instruction. Single-core processors can only process one instruction at a time. (To improve efficiency, processors commonly use pipelines internally, which allow several instructions to be in progress together; however, they still enter the pipeline one at a time.) A multi-core processor is composed of two or more independent cores. One can describe it as an integrated circuit which has two or more individual processors (called cores in this sense).[1] Manufacturers typically integrate the cores onto a single integrated circuit die (known as a chip multiprocessor or CMP), or onto multiple dies in a single chip package. A many-core processor is one in which the number of cores is large enough that traditional multi-processor techniques are no longer efficient — this threshold is somewhere in the range of several tens of cores — and probably requires a network on chip. (A minimal sketch of putting multiple cores to work is given below.)
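To make the idea of multiple cores concrete, here is a minimal sketch in Python (my own illustration, not drawn from the Wikipedia article) that asks the operating system how many cores it exposes and splits a simple computation across them, with one worker process per core:

```python
# Illustrative only: split a CPU-bound sum across however many cores the OS reports.
import os
from multiprocessing import Pool


def sum_of_squares(chunk):
    """Work that one core performs on one slice of the data."""
    return sum(n * n for n in chunk)


if __name__ == "__main__":
    cores = os.cpu_count() or 1                     # number of cores the OS exposes
    data = range(4_000_000)
    chunk_size = len(data) // cores + 1
    chunks = [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]
    with Pool(processes=cores) as pool:             # one worker process per core
        partial_sums = pool.map(sum_of_squares, chunks)
    print(f"{cores} cores used, total = {sum(partial_sums)}")
```

On a single-core machine the pool degenerates to one worker; any speed-up on a multi-core machine depends on how evenly the chunks divide the work and how many cores are actually available.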

ScienceDirect - Telematics and Informatics : Mobile computing and ubiquitous networking... - 0 views

  • With the rapidly increasing penetration of laptop computers, which are primarily used by mobile users to access Internet services, support of Internet services in a mobile environment becomes an increasing need. The opportunities emerging from these technologies give rise to new paradigms such as mobile computing and ubiquitous networking.

Welcome to info.cern.ch - 0 views

shared by Ben Groll on 13 Oct 08
  • CERN, the European Organization for Nuclear Research, is where it all began in March 1989. A physicist, Tim Berners-Lee, wrote a proposal for information management showing how information could be transferred easily over the Internet by using hypertext, the now familiar point-and-click system of navigating through information. The following year, Robert Cailliau, a systems engineer, joined in and soon became its number one advocate. The idea was to connect hypertext with the Internet and personal computers, thereby having a single information network to help CERN physicists share all the computer-stored information at the laboratory. Hypertext would enable users to browse easily between texts on web pages using links.
  • info.cern.ch was the address of the world's first-ever web site and web server, running on a NeXT computer at CERN. The first web page address was http://info.cern.ch/hypertext/WWW/TheProject.html, which centred on information regarding the WWW project. Visitors could learn more about hypertext, technical details for creating their own webpage, and even an explanation on how to search the Web for information. There are no screenshots of this original page and, in any case, changes were made daily to the information available on the page as the WWW project developed. (A small sketch that fetches this page and lists its hypertext links appears after the comments below.)
  •  
    This is about the first-ever website on the World Wide Web.
  •  
    This link tells about Tim Berners-Lee and the first website he created; he is also the creator of the World Wide Web itself.
  •  
    Welcome to info.cern.ch, the website of the world's first-ever web server. 1990 was a momentous year in world events. In February, Nelson Mandela was freed after 27 years in prison. In April, the space shuttle Discovery carried the Hubble Space Telescope into orbit. And in October, Germany was reunified.
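To see the point-and-click hypertext idea in practice, the sketch below (my own illustration, not part of the bookmarked page) fetches the first-page URL quoted above and lists the links it contains; it assumes network access and that CERN still serves a restored copy of the page at that address:

```python
# Fetch the restored first web page and print the hypertext links it contains.
# Assumes network access and that CERN still serves the historic URL below.
from html.parser import HTMLParser
from urllib.request import urlopen

URL = "http://info.cern.ch/hypertext/WWW/TheProject.html"


class LinkCollector(HTMLParser):
    """Collects the href target of every <a> tag encountered in the page."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


html = urlopen(URL).read().decode("utf-8", errors="replace")
collector = LinkCollector()
collector.feed(html)

print(f"{len(collector.links)} hypertext links found on the first web page:")
for href in collector.links:
    print("  ", href)
```

Each printed href is exactly the kind of hypertext link the proposal describes: a pointer a reader can follow from one document to another.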

National Center for Supercomputing Applications - Wikipedia, the free encyclopedia - 0 views

  • The National Center for Supercomputing Applications (NCSA) is a state-federal partnership to develop and deploy national-scale cyberinfrastructure that advances science and engineering. NCSA operates as a unit of the University of Illinois at Urbana-Champaign but it provides high-performance computing resources to researchers across the country. Support for NCSA comes from the National Science Foundation, the state of Illinois, the University of Illinois, business and industry partners, and other federal agencies.
  • These centers were founded when a group of University of Illinois faculty, led by Larry Smarr, sent an unsolicited proposal to the National Science Foundation in 1983. The foundation announced funding for the supercomputer centers in 1985; the first supercomputer at NCSA came online in January 1986.
  • NCSA provides leading-edge computing, data storage, and visualization resources. The NCSA computational and data environment implements a multi-architecture hardware strategy, deploying both clusters and shared-memory systems to support high-end users and communities on the architectures best suited to their requirements. Nearly 1,360 scientists, engineers and students used the computing and data systems at NCSA to support research in more than 830 projects. A list of NCSA hardware is available on the NCSA Capabilities page.
  • Today NCSA is collaborating with IBM, under a grant from the National Science Foundation, to build [1] "Blue Waters," a supercomputer capable of performing 1 quadrillion calculations per second, a measure known as a petaflop. Blue Waters is due to come online in 2011.
  • The Mosaic web browser, the first popular graphical Web browser, which played an important part in expanding the growth of the World Wide Web, was written by Marc Andreessen and Eric Bina at NCSA. Andreessen and Bina went on to develop the Netscape Web browser. Mosaic was later licensed to Spyglass, Inc., which provided the foundation for Internet Explorer.
  • Initially, NCSA's administrative offices were in the Water Resources Building and employees were scattered across the campus. NCSA is now headquartered within its own building directly north of the Siebel Center for Computer Science, on the site of a former baseball field, Illini Field. NCSA's supercomputers remain at the Advanced Computation Building, but construction is now under way on a Petascale Computing Facility to house Blue Waters.
  •  
    The NCSA was an important stepping stone in the evolution toward Web 2.0.

Ubiquitous Computing in Education: Invisible Technology, Visible Impact; Teach Beyond Y... - 0 views

  •  
    Ubiquitous computing in education, as defined in this book, is teachers and students having access to technology (computing devices, the Internet, services) whenever and wherever they need it. In a world of ubiquitous computing, the technology is always accessible and is not the focus of learning. Rather, faculty and students are active partners in the learning process, and they decide not only what technology is needed but also what to learn and how best to create new knowledge.

How cloud computing is impacting everyday life - Thoughts on Cloud - 0 views

  •  
    Useful list of how cloud computing impacts individuals and community. Some clear examples that show practical uses for cloud computing.

Resource #1 - 0 views

  • In the 2000s the Internet grew to an astounding level not only in the number of people who regularly logged on to the World Wide Web (WWW) but in the speed and capability of its technology. By December 2009, 26 percent of the world’s population used the Internet and “surfed the web.”
  • The rapid growth of Internet technology and usage had a drastic cultural effect on the United States. Although that impact was mostly positive, the WWW caused many social concerns. With financial transactions and personal information being stored on computer databases, credit-card fraud and identity theft were frighteningly common.
  • Hackers accessed private and personal information and used it for personal gain. Hate groups and terrorist organizations actively recruited online, and the threat remained of online terrorist activities ranging from planting computer viruses to potentially blowing up power stations by hacking computers that ran the machinery. Copyright infringement was a growing concern
  • At the turn of the century, most users accessed the Internet by a dial-up connection in which computers used modems to connect to other computers using existing telephone lines. Typical dial-up connections ran at 56 kilobits per second. (A rough comparison with broadband speeds is sketched after this list.)
  • Traditional communications media such as telephone and television services were redefined by technologies such as instant messaging, Voice over Internet Protocol (VoIP), mobile smartphones, and streaming video.
  • The Internet changed the production, sale, and distribution of print publications, software, news, music, film, video, photography, and everyday products from soap to automobiles.
  • With broadband, Internet users could download and watch videos in a matter of seconds, media companies could offer live streaming-video newsfeeds, and peer-to-peer file sharing became efficient and commonplace. News was delivered on websites, blogs, and webfeeds, and e-commerce changed the way people shopped. Television shows, home movies, and feature films were viewed on desktop or laptop computers and even on cell phones. Students researched online, and many parents began working from home for their employers or started their own online businesses.
  • It was also becoming increasingly easy for users to access it from Internet cafés, Internet kiosks, access terminals, and web pay phones. With the advent of wireless, customers could connect to the Internet from virtually any place that offered remote service in the form of a wireless local area network (WLAN) or Wi-Fi router.
  • In January 2001 Apple launched the iPod digital music player, and then in April 2003 it opened the iTunes Store, allowing customers to legally purchase songs for 99 cents. Although federal courts ordered that music-sharing services such as Napster could be held liable if they were used to steal copyrighted works, Fanning’s brainchild realized the power of peer-to-peer file sharing and the potential success of user-generated Internet services.
  • Email was the general form of internet communication and allowed users to send electronic text messages. Users could also attach additional files containing text, pictures, or videos. Chat rooms and instant-messaging systems were also popular methods of online communication and were even quicker than traditional email. Broadband made other popular forms of Internet communication possible, including video chat rooms and video conferencing. Internet telephony or VoIP became increasingly popular for gaming applications.
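As a rough, back-of-the-envelope illustration of what the 56 kbit/s figure above means in practice, the short sketch below compares transfer times for a single file; the 5 MB file size and the 10 Mbit/s broadband speed are assumptions chosen for illustration, not figures from the article:

```python
# Back-of-the-envelope transfer times; the file size and broadband speed are assumptions.
def transfer_seconds(size_bytes, bits_per_second):
    """Seconds needed to move size_bytes over a link of the given speed."""
    return size_bytes * 8 / bits_per_second

song_bytes = 5 * 1024 * 1024      # assume a 5 MB file, e.g. one MP3 song
dialup_bps = 56_000               # 56 kbit/s dial-up modem (figure from the text)
broadband_bps = 10_000_000        # assumed 10 Mbit/s broadband line

print(f"dial-up:   {transfer_seconds(song_bytes, dialup_bps) / 60:.1f} minutes")
print(f"broadband: {transfer_seconds(song_bytes, broadband_bps):.1f} seconds")
```

Under those assumptions, a file that arrives in a few seconds over broadband ties up a dial-up line for well over ten minutes, which is part of why broadband changed how media could be delivered.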

Flatclassroom: Emerg - 0 views

started by Erin B on 27 Sep 10; no follow-up yet

What cloud computing really means | Cloud Computing - InfoWorld - 1 views

  •  
    reasons for cloud computing; pros and cons

Microsoft Windows - Wikipedia, the free encyclopedia - 0 views

  • Microsoft Windows is a series of software operating systems and graphical user interfaces produced by Microsoft. Microsoft first introduced an operating environment named Windows in November 1985 as an add-on to MS-DOS in response to the growing interest in graphical user interfaces (GUIs).[2] Microsoft Windows came to dominate the world's personal computer market, overtaking Mac OS, which had been introduced in 1984. As of October 2009, Windows had approximately 91% of the market share of the client operating systems for usage on the Internet.[3][4][5] The most recent client version of Windows is Windows 7; the most recent server version is Windows Server 2008 R2; the most recent mobile OS version is Windows Phone 7.

How workflow software is - 2 views

  • Workflow software provides a way for your computer to achieve its highest potential — to do work for you so you can get on with other important tasks.
  •  
    This is some really good information on why workflow software is needed and how it is useful.
  •  
    Explanation of what workflow software is.
  •  
    Workflow software comes in many functionalities; some of the most familiar ones handle the most common computer tasks.
  •  
    This article discusses how workflow software enables a computer to operate at a higher level than it would without such software to assist it.
  •  
    This article explains what workflow software is and how it increases people's productivity. It also tells the reader how workflow software appeals to many people through the tasks it simplifies. (A minimal sketch of the idea follows below.)
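As a purely illustrative sketch of the idea, the snippet below chains a few automated steps so the computer carries them out on its own; the step names and the tiny runner are invented for illustration, not taken from the bookmarked article:

```python
# Toy workflow runner: describe the steps once, then let the computer run them in order.
from datetime import datetime


def fetch_report():
    """Pretend to collect some raw data (a stand-in for a real task)."""
    return "raw report data"


def summarize(data):
    """Turn the raw data into a short summary."""
    return f"summary of: {data}"


def archive(summary):
    """Store the result somewhere and note when it happened."""
    print(f"[{datetime.now():%H:%M:%S}] archived -> {summary}")


def run_workflow(steps):
    """Run each step, feeding one step's output into the next."""
    payload = None
    for step in steps:
        payload = step() if payload is None else step(payload)
    return payload


run_workflow([fetch_report, summarize, archive])
```

Real workflow software adds scheduling, error handling, and integrations with other programs, but the core idea is the same: describe the steps once and let the machine run them.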

Bill Kerr: notes on cloud computing - 1 views

  •  
    What is cloud computing? What are the differences between cloud computing, Web 2.0 and the Internet?

Scott Hyten - LinkedIn - 0 views

  • CEO at Wild Brain
  • the largest independent animation studio at Wild Brain
  • building more than 100 computer-generated television shows and music videos for the Walt Disney Company, Hyten has pioneered the use and integration of technology utilizing a worldwide supply chain while producing product for a global market.
  • He is featured in Pulitzer Prize-winning author Tom Friedman’s book, “The World is Flat.”
  • Over the last 25 years, Scott Hyten has either been a founding employee, founded, co-founded or provided startup capital for some of the world’s leading companies and practices, including technology practice Computer Sciences Corporation (NYSE:CSC) (Continuum outsourcing), the world’s leading healthcare technology practice at Perot Systems (NYSE:PER);
  •  
    Worked for Wild Brain, an animation studio that created content for the Disney Channel.
  •  
    Over the last 25 years, Scott Hyten has either been a founding employee, founded, co-founded or provided startup capital for some of the world's leading companies and practices, including technology practice Computer Sciences Corporation (NYSE:CSC) (Continuum outsourcing); the world's leading healthcare technology practice at Perot Systems (NYSE:PER); the largest independent animation studio at Wild Brain; and the world's leading managed hosting and internet broadcast company at ThePlanet.com. Whether through managing 3-D seismic exploration in the North Sea, Indonesia and Africa for Mobil Oil or building more than 100 computer-generated television shows and music videos for the Walt Disney Company, Hyten has pioneered the use and integration of technology utilizing a worldwide supply chain while producing product for a global market. He is featured in Pulitzer Prize-winning author Tom Friedman's book, "The World is Flat," and has latterly received the Albert Einstein Award for technology. Scott Hyten's specialties: Technology, Entertainment, Digital Content Distribution and Music.

Computer History Museum - Timeline of Computer History - 0 views

  •  
    A computer history timeline; it can give us an idea of how much computers have improved over the years and how digital they have become.

Eco-friendly computers | MNN - Mother Nature Network - 0 views

  • When it comes to considering eco-friendly computers, the environmental impact of PCs has never really factored much into purchasing decisions
  •  
    Eco Friendly Computers

Associations of Leisure-Time Internet and Computer Use With Overweight and Obesity, Phy... - 0 views

  •  
    This article is about how people with less Internet leisure time had a higher level of educational attainment and employment than people with high leisure time on the computer or Internet. It also finds that people with high Internet use and computer leisure time are more likely to be overweight. This shows how the Internet has had a major effect on leisure.

Mobile device - Wikipedia, the free encyclopedia - 0 views

  • A mobile device (also known as a handheld device, handheld computer or simply handheld) is a pocket-sized computing device, typically having a display screen with touch input and/or a miniature keyboard. In the case of the personal digital assistant (PDA) the input and output are often combined into a touch-screen interface. Smartphones and PDAs are popular amongst those who require the assistance and convenience of certain aspects of a conventional computer, in environments where carrying one would not be practical.
  • Handheld devices have become ruggedized for use in mobile field management situations to record information in the field. They are used to achieve a variety of tasks for increasing efficiency, including digitizing notes, sending and receiving invoices, asset management, recording signatures, managing parts and scanning barcodes.