
Flat Classroom Project: Group items tagged data


Toni H.

Multi-core processor - Wikipedia, the free encyclopedia - 0 views

  • In computing, a processor is the unit that reads and executes program instructions, which are fixed-length (typically 32 or 64 bit) or variable-length chunks of data. The data in the instruction tells the processor what to do. The instructions are very basic things like reading data from memory or sending data to the user display, but they are processed so rapidly that we experience the results as the smooth operation of a program. Processors were originally developed with only one core. The core is the part of the processor that actually performs the reading and executing of the instruction. Single-core processors can only process one instruction at a time. (To improve efficiency, processors commonly utilize pipelines internally, which allow several instructions to be processed together; however, they are still fed into the pipeline one at a time.) A multi-core processor is composed of two or more independent cores. One can describe it as an integrated circuit which has two or more individual processors (called cores in this sense).[1] Manufacturers typically integrate the cores onto a single integrated circuit die (known as a chip multiprocessor or CMP), or onto multiple dies in a single chip package. A many-core processor is one in which the number of cores is large enough that traditional multi-processor techniques are no longer efficient — this threshold is somewhere in the range of several tens of cores — and probably requires a network on chip.
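To make the payoff of multiple cores concrete, here is a minimal sketch (not from the excerpt) that splits a CPU-bound sum across one worker process per core using Python's standard multiprocessing module. The sum_range helper and the chunking scheme are illustrative choices, not anything the article prescribes.

```python
# Spread CPU-bound work across cores: each worker process sums one chunk.
from multiprocessing import Pool, cpu_count

def sum_range(bounds):
    """CPU-bound work: sum the integers in [start, stop)."""
    start, stop = bounds
    return sum(range(start, stop))

if __name__ == "__main__":
    n = 10_000_000
    cores = cpu_count()              # number of cores the OS reports
    step = n // cores
    chunks = [(i * step, (i + 1) * step) for i in range(cores)]
    chunks[-1] = (chunks[-1][0], n)  # make sure the last chunk reaches n

    # One worker process per core; each core sums its own chunk in parallel.
    with Pool(processes=cores) as pool:
        partial_sums = pool.map(sum_range, chunks)

    print(sum(partial_sums))  # same result as sum(range(n)), computed in parallel
```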
scott summerlin

Google - Wikipedia, the free encyclopedia - 1 views

  • Google runs over one million servers in data centers around the world,[14] and processes over one billion search requests[15] and twenty petabytes of user-generated data every day.[16][17][18] Google's rapid growth since its incorporation has triggered a chain of products, acquisitions and partnerships beyond the company's core search engine. The company offers online productivity software, such as its Gmail e-mail software, and social networking tools, including Orkut and, more recently, Google Buzz.
  •  
    "Google Inc. (NASDAQ: GOOG, FWB: GGQ1) is a multinational public cloud computing, Internet search, and advertising technologies corporation. Google hosts and develops a number of Internet-based services and products,[5] and generates profit primarily from advertising through its AdWords program"
  •  
    Description of Google.
mitch g

What is VPN? - A Word Definition From the Webopedia Computer Dictionary - 0 views

  • VPN (pronounced as separate letters): Short for virtual private network, a network that is constructed by using public wires to connect nodes. For example, there are a number of systems that enable you to create networks using the Internet as the medium for transporting data. These systems use encryption and other security mechanisms to ensure that only authorized users can access the network and that the data cannot be intercepted.
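The definition's key idea is encrypting traffic that crosses public wires so it cannot be intercepted. The sketch below illustrates just that layer, using TLS from Python's standard ssl module; a real VPN tunnels all of a machine's IP traffic rather than a single socket, so treat this only as a demonstration of the encryption-over-public-networks principle.

```python
# Encrypt traffic over a public network: TLS wraps a plain TCP socket.
import socket
import ssl

context = ssl.create_default_context()  # verifies server certificates by default

with socket.create_connection(("example.com", 443)) as raw_sock:
    # Wrap the plain TCP socket so everything sent/received is encrypted
    # in transit, even though it travels over public infrastructure.
    with context.wrap_socket(raw_sock, server_hostname="example.com") as tls_sock:
        tls_sock.sendall(b"HEAD / HTTP/1.1\r\nHost: example.com\r\n\r\n")
        print(tls_sock.recv(200))  # the response arrives over the encrypted channel
```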
tyler smith

uploading and downloading - 0 views

shared by tyler smith on 06 Oct 09
  • To download means to receive data to a local system from a remote system, or to initiate such a data transfer.
  • Examples of a remote system from which a download might be performed include a webserver, FTP server, email server, or other similar systems. A download can mean any file that is offered for downloading or that has been downloaded, or the process of receiving such a file.
  • In contrast, downloading is distinguished from the related concept of streaming, in which the data is used almost immediately as it is received, while the transmission is still in progress, and may not be stored long-term; downloading implies that the data is usable only once it has been received in its entirety. (The sketch after this item contrasts the two modes in code.)
  • ...2 more annotations...
  • The use of the terms uploading and downloading often implies that the data sent or received is to be stored permanently, or at least stored more than temporarily.
  • When there is a transfer of data from a remote system to another remote system, the process is called "remote uploading".
  •  
    Good source for the definitions.
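As referenced above, here is a minimal sketch contrasting the two modes the definitions describe: a download stores the whole file, usable only once complete, while streaming consumes each chunk as it arrives. It uses the third-party requests library, and the URL is a placeholder, not a real endpoint.

```python
import requests

URL = "https://example.com/big-file.bin"  # hypothetical URL for illustration

# Downloading: receive the whole file, then store it locally;
# the data is usable only after the transfer completes.
response = requests.get(URL)
with open("big-file.bin", "wb") as f:
    f.write(response.content)

# Streaming: act on each chunk as soon as it arrives,
# while the transmission is still in progress.
bytes_seen = 0
with requests.get(URL, stream=True) as response:
    for chunk in response.iter_content(chunk_size=8192):
        bytes_seen += len(chunk)  # a media player would decode the chunk here
print(bytes_seen)
```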
Kendall Butler

Outsource Insurance Claims Processing Services | Health Insurance Claims Process - 0 views

  •  
    " Advantages of our Insurance claims Processing Service: * Competitive Rates * Highly skilled trained experts * 24/7 Customer Support * Quick management and free of insurance claims * Skill to automate generation and archival process * Ability to improves the process of capturing, storing and extracting data on insurance claims * Reduces insurance claim processing costs Outsource data processing, offers a wide range of customized solutions for the insurance application, reviewing the insurance application, application processing, verification and validation applications, management customer services. We use a variety of savings plans for each sector to manage costs and better management of the insurance claim. Outsource Data Processing delivers effective claims processing services for entire health services provider at very cost effective insurance claims processing rates. Outsource your insurance claims processing requirements to us. Our insurance claims processing services provide innovative medical claims processing, outsourcing online electronic claims processing services at cost effective rates. "
Ralph C

A New Era of Transparency | 3 Round Stones - 0 views

  •  
    This is about the data challenges that governments face and how Linked Data can help: "All governments have large data challenges. They have become even more challenging in this period of fiscal austerity. The use of Linked Data techniques allows governments to publish more, reuse more and combine more data for a fraction of the cost of older methods."
Mike tiani

Mobile technology - 1 views

  •  
    "Mobile technology is exactly what the name implies - technology that is portable. Examples of mobile IT devices include: laptop and netbook computers palmtop computers or personal digital assistants mobile phones and 'smart phones' global positioning system (GPS) devices wireless debit/credit card payment terminals Mobile devices can be enabled to use a variety of communications technologies such as: wireless fidelity (Wi-Fi) - a type of wireless local area network technology Bluetooth - connects mobile devices wirelessly 'third generation' (3G), global system for mobile communications (GSM) and general packet radio service (GPRS) data services - data networking services for mobile phones dial-up services - data networking services using modems and telephone lines virtual private networks - secure access to a private network It is therefore possible to network the mobile device to a home office or the internet while travelling. Benefits Mobile computing can improve the service you offer your customers. For example, when meeting with customers you could access your customer relationship management system - over the internet - allowing you to update customer details whilst away from the office. Alternatively, you can enable customers to pay for services or goods without having to go to the till. For example, by using a wireless payment terminal diners can pay for their meal without leaving their table. More powerful solutions can link you directly into the office network while working off site, for instance to access your database or accounting systems. For example, you could: set up a new customer's account check prices and stock availability place an order online This leads to great flexibility in working - for example, enabling home working, or working while travelling. Increasingly, networking 'hot spots' are being provided in public areas that allow connection back to the office network or the internet.
  •  
    Gives examples of which types of products are mobile and the networks they work on.
marlee mikol

The U.S. Government's Growing Appetite for Google Users' Data - 0 views

  •  
    This article is about how government and law-enforcement demands that Google share user data are growing 25 percent every six months. Agencies are asking Google to hand over data on its customers to help with investigations.
kimberly caise

The Atlantic Online | January/February 2010 | What Makes a Great Teacher? | Amanda Ripley - 0 views

  • This tale of two boys, and of the millions of kids just like them, embodies the most stunning finding to come out of education research in the past decade: more than any other variable in education—more than schools or curriculum—teachers matter. Put concretely, if Mr. Taylor’s student continued to learn at the same level for a few more years, his test scores would be no different from those of his more affluent peers in Northwest D.C. And if these two boys were to keep their respective teachers for three years, their lives would likely diverge forever. By high school, the compounded effects of the strong teacher—or the weak one—would become too great.
  • Farr was tasked with finding out. Starting in 2002, Teach for America began using student test-score progress data to put teachers into one of three categories: those who move their students one and a half or more years ahead in one year; those who achieve one to one and a half years of growth; and those who yield less than one year of gains. In the beginning, reliable data was hard to come by, and many teachers could not be put into any category. Moreover, the data could never capture the entire story of a teacher’s impact, Farr acknowledges.
  • They were also perpetually looking for ways to improve their effectiveness
  • ...12 more annotations...
  • First, great teachers tended to set big goals for their students.
  • Great teachers, he concluded, constantly reevaluate what they are doing.
  • Superstar teachers had four other tendencies in common: they avidly recruited students and their families into the process; they maintained focus, ensuring that everything they did contributed to student learning; they planned exhaustively and purposefully—for the next day or the year ahead—by working backward from the desired outcome; and they worked relentlessly, refusing to surrender to the combined menaces of poverty, bureaucracy, and budgetary shortfalls.
  • When her fourth-grade students entered her class last school year, 66 percent were scoring at or above grade level in reading. After a year in her class, only 44 percent scored at grade level, and none scored above. Her students performed worse than fourth-graders with similar incoming scores in other low-income D.C. schools. For decades, education researchers blamed kids and their home life for their failure to learn. Now, given the data coming out of classrooms like Mr. Taylor’s, those arguments are harder to take. Poverty matters enormously. But teachers all over the country are moving poor kids forward anyway, even as the class next door stagnates. “At the end of the day,” says Timothy Daly at the New Teacher Project, “it’s the mind-set that teachers need—a kind of relentless approach to the problem.”
  • are almost never dismissed.
  • What did predict success, interestingly, was a history of perseverance—not just an attitude, but a track record. In the interview process, Teach for America now asks applicants to talk about overcoming challenges in their lives—and ranks their perseverance based on their answers.
  • Gritty people, the theory goes, work harder and stay committed to their goals longer
  • This year, Teach for America allowed me to sit in on the part of the interview process that it calls the “sample teach,” in which applicants teach a lesson to the other applicants for exactly five minutes. Only about half of the candidates make it to this stage. On this day, the group includes three men and two women, all college seniors or very recent graduates.
  • But if school systems hired, trained, and rewarded teachers according to the principles Teach for America has identified, then teachers would not need to work so hard. They would be operating in a system designed in a radically different way—designed, that is, for success.
  • five observation sessions conducted throughout the year by their principal, assistant principal, and a group of master educators.
  • t year’s end, teachers who score below a certain threshold could be fired.
  • But this tradition may be coming to an end. He’s thinking about quitting in the next few years.
Thomas H

Home - MSc Mobile and Ubiquitous Computing : Trinity College Dublin - 0 views

  •  
    "Mobile computing allows people to make use of computing and information systems without being tied to a desktop computer located in their office, classroom, or home. People can now make use of computer systems while on the move, whether waiting for a flight in some airport departure lounge, drinking coffee in their favorite cafe, simply driving home, or even just walking down the street. Thanks to the improved portability and processing power of laptop computers, Personal Digital Assistants, and even mobile phones, as well as improved battery life and the near universal coverage of wireless data communications networks, mobile computer users can now make use of almost the same range of services as desktop users. While the use of current mobile computers often follows the traditional pattern of a single user interacting with their dedicated computer via its own display and keyboard, mobile computing is still at an early stage of development. In his seminal paper on the computer for the 21st century written in 1991†, Marc Weiser noted that "The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it." Weiser put forward a vision of future computer systems in which "computers themselves vanish into the background". In doing so he inspired a field of research known as ubiquitous computing . In the ubiquitous computing vision, interconnected (mobile) computers are embedded unobtrusively in everyday appliances and environments and co-operate to provide information and services on behalf of their users. The ubiquitous computing vision is now becoming a reality enabled by recent and expected developments in new sensor technologies - increasing the range of stimuli that can be effectively sensed, by wireless networking - allowing mobile computer systems to co-operate, by miniaturization of computational devices - allowing massive deployment of sensor-based systems in every
marlee mikol

Here's What Google Does When the Government Wants Your Emails - 0 views

  •  
    This article is about how Google stresses that it does everything in its power to keep user data private when it is legally able to do so; until recently, this process was kept secret. When a government request arrives, Google first checks whether it is legally valid. If a request passes that stage, Google then considers whether it is overly broad in scope; if so, it either denies the request or looks to narrow it. Google's standards for handing over user data are stronger than those required by American law.
Lauren Skillinge

iOS Apps Can Be Hijacked to Show Fraudulent Content and Intercept Data - 1 views

  •  
    This article explains that some apps can be hijacked, causing them to display false information and to intercept the user's personal data.
Kyle Bambu

MicroStrategy revamps software for bigger data sets, faster visuals | PCWorld - 0 views

  •  
    This article talks about a company, MicroStrategy, that is expanding its software platforms and upgrading software to fit today's needs.
Brandon J

Uploading and downloading - Wikipedia, the free encyclopedia - 0 views

  • The inverse operation, uploading, can refer to the sending of data from a local system to a remote system, such as a server or another client, with the intent that the remote system should store a copy of the data being transferred, or to the initiation of such a process. The words first came into popular usage among computer users with the increased popularity of Bulletin Board Systems (BBSs), facilitated by the widespread distribution and implementation of dial-up access in the 1970s.
  • remote system, or to initiate such a data transfer. Examples of a remote system from which a download might be performed include a webserver, FTP server, email server, or other
  • nothing to do with the size of the systems involved (see Sideload below). A download can mean either any file that is offered for downloading or
  • ...4 more annotations...
  • installing or simply combine them incorrectly
  • meaning of downloading
  • mistake and confuse
  • become more
mitch g

What is workflow? - Definition from Whatis.com - 1 views

  • Definition: Workflow is a term used to describe the tasks, procedural steps, organizations or people involved, required input and output information, and tools needed for each step in a business process. A workflow approach to analyzing and managing a business process can be combined with an object-oriented programming approach, which tends to focus on documents and data. In general, workflow management focuses on processes rather than documents. A number of companies make workflow automation products that allow a company to create a workflow model and components such as online forms, and then to use this product as a way to manage and enforce the consistent handling of work. For example, an insurance company could use a workflow automation application to ensure that a claim was handled consistently from initial call to final settlement. The workflow application would ensure that each person handling the claim used the correct online form and successfully completed their step before allowing the process to proceed to the next person and procedural step. A workflow engine is the component in a workflow automation program that knows all the procedures, steps in a procedure, and rules for each step. The workflow engine determines whether the process is ready to move to the next step. Some vendors sell workflow automation products for particular industries, such as insurance and banking, or for commonly used processes, such as handling computer service calls. Proponents of the workflow approach believe that task analysis and workflow modeling in themselves are likely to improve business operations.
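To ground the definition, here is a minimal sketch of a workflow engine in the sense described above: it knows the steps of a procedure and the rule for each step, and advances a work item only when the current rule passes. The insurance-claim steps and their rules are hypothetical, invented to echo the definition's example, not taken from any real product.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Step:
    name: str
    rule: Callable[[Dict], bool]  # decides whether the work item may proceed

@dataclass
class WorkflowEngine:
    steps: List[Step]

    def run(self, item: Dict) -> bool:
        # Enforce consistent handling: each step's rule must pass
        # before the process moves to the next step.
        for step in self.steps:
            if not step.rule(item):
                print(f"Stopped at step: {step.name}")
                return False
            print(f"Completed step: {step.name}")
        return True

# Hypothetical claim-handling workflow echoing the insurance example above.
claim_workflow = WorkflowEngine(steps=[
    Step("initial call logged", lambda c: "claim_id" in c),
    Step("correct online form used", lambda c: c.get("form") == "CL-100"),
    Step("final settlement recorded", lambda c: c.get("amount", 0) > 0),
])

claim_workflow.run({"claim_id": 42, "form": "CL-100", "amount": 1200.0})
```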
mitch g

Workflow Software White Papers ( Workflow Automation, Work Flow Software, Workflow Mana... - 0 views

  • DEFINITION: Workflow is a term used to describe the tasks, procedural steps, organizations or people involved, required input and output information, and tools needed for each step in a business process. A workflow approach to analyzing and managing a business process can be combined with an object-oriented programming approach, which tends to focus on documents and data. In general, workflow management focuses  …  Definition continues below.
mitch g

Mike McCue | CrunchBase Profile - 0 views

  • Mike McCue founded Tellme Networks in 1999 as the CEO. He joined Microsoft as the General Manager of the Tellme subsidiary after its acquisition. In 2000, Mike led Tellme to launch one of the world’s first Internet platforms to deliver web data to anyone over any telephone. Starting with simple Web services, Tellme’s innovative platform inspired the migration of large-scale phone services from proprietary applications to open standards applications and drove the global adoption of VoiceXML. Before founding Tellme, Mike worked at Netscape as Vice President of Technology. He joined Netscape after their successful acquisition of the first company he founded, Paper Software, a leader in 3-D browser technology. Mike was honored with a Kilby International Award as a Young Innovator for his work bringing 3-D technology to the world through Netscape’s Web browser.
ooechs 0

Mobile Computing: Past, Present and Future - 0 views

  • Mobile Computing: A technology that allows transmission of data, via a computer, without having to be connected to a fixed physical link.
  • communications market
  • Today, the mobile data communications market is becoming dominated by a technology called CDPD.
  • ...3 more annotations...
  • Cellular Digital Packet Data (CDPD)
  • Speed best
    • ooechs 0
       
      Shows the difference between mainframe, PC, and ubiquitous computing. I don't know if this graph is accurate, but it would be useful if it is.
Kunjan P

National Center for Supercomputing Applications - Wikipedia, the free encyclopedia - 0 views

  • The National Center for Supercomputing Applications (NCSA) is a state-federal partnership to develop and deploy national-scale cyberinfrastructure that advances science and engineering. NCSA operates as a unit of the University of Illinois at Urbana-Champaign but it provides high-performance computing resources to researchers across the country. Support for NCSA comes from the National Science Foundation, the state of Illinois, the University of Illinois, business and industry partners, and other federal agencies.
  • These centers were founded when a group of University of Illinois faculty, led by Larry Smarr, sent an unsolicited proposal to the National Science Foundation in 1983. The foundation announced funding for the supercomputer centers in 1985; the first supercomputer at NCSA came online in January 1986.
  • NCSA provides leading-edge computing, data storage, and visualization resources. NCSA's computational and data environment implements a multi-architecture hardware strategy, deploying both clusters and shared-memory systems to support high-end users and communities on the architectures best suited to their requirements. Nearly 1,360 scientists, engineers, and students used the computing and data systems at NCSA to support research in more than 830 projects. A list of NCSA hardware is available at NCSA Capabilities.
  • ...3 more annotations...
  • Today NCSA is collaborating with IBM, under a grant from the National Science Foundation, to build "Blue Waters," a supercomputer capable of performing 1 quadrillion calculations per second, a measure known as a petaflop (see the quick illustration after this item). Blue Waters is due to come online in 2011.
  • The Mosaic web browser, the first popular graphical Web browser, which played an important part in expanding the growth of the World Wide Web, was written by Marc Andreessen and Eric Bina at NCSA. Andreessen and Bina went on to develop the Netscape Web browser. Mosaic was later licensed to Spyglass, Inc., which provided the foundation for Internet Explorer.
  • Initially, NCSA's administrative offices were in the Water Resources Building and employees were scattered across the campus. NCSA is now headquartered within its own building directly north of the Siebel Center for Computer Science, on the site of a former baseball field, Illini Field. NCSA's supercomputers remain at the Advanced Computation Building, but construction is now under way on a Petascale Computing Facility to house Blue Waters.
  •  
    The NCSA is a great stepping stone to the evolution of Web 2.0.
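As a quick back-of-the-envelope illustration of the petaflop figure mentioned above (the workload size below is hypothetical, chosen only to show the scale):

```python
# 1 petaflop = one quadrillion (10**15) floating-point calculations per second.
ops_per_second = 1e15
workload = 1e18  # a hypothetical job requiring a quintillion operations
print(workload / ops_per_second)  # -> 1000.0 seconds at a sustained petaflop
```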
Hayes G.

In Few Years, Social Network Data May Be Used in Underwriting - 0 views

  •  
    "In Few Years, Social Network Data May Be Used in Underwriting"