
Computer Science Knowledge Sharing: Group items tagged "wikis"


Abdelrahman Ogail

Simulated annealing - Wikipedia, the free encyclopedia - 1 views

  • Simulated annealing (SA) is a generic probabilistic metaheuristic for the global optimization problem of applied mathematics, namely locating a good approximation to the global minimum of a given function in a large search space. It is often used when the search space is discrete (e.g., all tours that visit a given set of cities). For certain problems, simulated annealing may be more effective than exhaustive enumeration — provided that the goal is merely to find an acceptably good solution in a fixed amount of time, rather than the best possible solution. The name and inspiration come from annealing in metallurgy, a technique involving heating and controlled cooling of a material to increase the size of its crystals and reduce their defects. The heat causes the atoms to become unstuck from their initial positions (a local minimum of the internal energy) and wander randomly through states of higher energy; the slow cooling gives them more chances of finding configurations with lower internal energy than the initial one. By analogy with this physical process, each step of the SA algorithm replaces the current solution by a random "nearby" solution, chosen with a probability that depends on the difference between the corresponding function values and on a global parameter T (called the temperature), that is gradually decreased during the process. The dependency is such that the current solution changes almost randomly when T is large, but increasingly "downhill" as T goes to zero. The allowance for "uphill" moves saves the method from becoming stuck at local minima—which are the bane of greedier methods. The method was independently described by S. Kirkpatrick, C. D. Gelatt and M. P. Vecchi in 1983 [1], and by V. Černý in 1985 [2]. The method is an adaptation of the Metropolis-Hastings algorithm, a Monte Carlo method to generate sample states of a thermodynamic system, invented by N. Metropolis et al. in 1953 [3].
  •  
    A natural AI approach
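The loop the excerpt describes is short enough to sketch. Here is a minimal, illustrative Python version; the energy function, neighbour move, and geometric cooling schedule are made-up examples, not anything prescribed by the article:

```python
import math
import random

def simulated_annealing(energy, neighbour, state, t0=10.0, cooling=0.995, steps=10_000):
    """Generic SA loop: always accept improvements, accept worse states with prob. exp(-delta/T)."""
    best = current = state
    temperature = t0
    for _ in range(steps):
        candidate = neighbour(current)
        delta = energy(candidate) - energy(current)
        if delta <= 0 or random.random() < math.exp(-delta / temperature):
            current = candidate                      # "uphill" moves keep us out of local minima
        if energy(current) < energy(best):
            best = current
        temperature *= cooling                       # geometric cooling: T slowly goes toward zero
    return best

# Toy usage: minimise a bumpy one-dimensional function.
bumpy = lambda x: x * x + 10 * math.sin(3 * x)
print(simulated_annealing(bumpy, lambda x: x + random.uniform(-0.5, 0.5), state=5.0))
```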
Abdelrahman Ogail

Flocking (behavior) - Wikipedia, the free encyclopedia - 0 views

  • Flocking behavior is the behavior exhibited when a group of birds, called a flock, are foraging or in flight. There are parallels with the shoaling behavior of fish, or the swarming behavior of insects. Computer simulations and mathematical models which have been developed to emulate the flocking behaviors of birds can generally be applied also to the "flocking" behavior of other species. As a result, the term "flocking" is sometimes applied, in computer science, to species other than birds. This article is about the modelling of flocking behavior. From the perspective of the mathematical modeller, "flocking" is the collective motion of a large number of self-propelled entities and is a collective animal behavior exhibited by many living beings such as birds, fish, bacteria, and insects.[1] It is considered an emergent behaviour arising from simple rules that are followed by individuals and does not involve any central coordination. Flocking behavior was first simulated on a computer in 1986 by Craig Reynolds with his simulation program, Boids. This program simulates simple agents (boids) that are allowed to move according to a set of basic rules. The result is akin to a flock of birds, a school of fish, or a swarm of insects.
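The Boids model mentioned at the end boils down to three local rules (separation, alignment, cohesion). A rough Python sketch; the radius and weights are arbitrary illustration values, not Reynolds' original parameters:

```python
import random

class Boid:
    def __init__(self):
        self.pos = [random.uniform(0, 100), random.uniform(0, 100)]
        self.vel = [random.uniform(-1, 1), random.uniform(-1, 1)]

def step(boids, radius=20.0, w_sep=0.05, w_ali=0.05, w_coh=0.01):
    # Note: updates velocities in place for brevity; a stricter simulation would snapshot them first.
    for b in boids:
        near = [o for o in boids if o is not b and
                (o.pos[0] - b.pos[0]) ** 2 + (o.pos[1] - b.pos[1]) ** 2 < radius ** 2]
        if not near:
            continue
        n = len(near)
        for i in (0, 1):                                       # x and y components
            centre = sum(o.pos[i] for o in near) / n           # cohesion: steer toward local centre
            avg_vel = sum(o.vel[i] for o in near) / n          # alignment: match neighbours' heading
            away = sum(b.pos[i] - o.pos[i] for o in near) / n  # separation: avoid crowding
            b.vel[i] += w_coh * (centre - b.pos[i]) + w_ali * (avg_vel - b.vel[i]) + w_sep * away
    for b in boids:                                            # then everyone moves
        b.pos[0] += b.vel[0]
        b.pos[1] += b.vel[1]

flock = [Boid() for _ in range(30)]
for _ in range(100):
    step(flock)
print(flock[0].pos)
```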
Abdelrahman Ogail

Clockwork universe theory - Wikipedia, the free encyclopedia - 1 views

  • The Clockwork Universe Theory is a theory, established by Isaac Newton, as to the origins of the universe. A "clockwork universe" can be thought of as being a clock wound up by God and ticking along, as a perfect machine, with its gears governed by the laws of physics. What sets this theory apart from others is the idea that God's only contribution to the universe was to set everything in motion, and from there the laws of science took hold and have governed every sequence of events since that time. This idea was very popular during the Enlightenment, when scientists realized that Newton's laws of motion, including the law of universal gravitation, could explain the behavior of the solar system. A notable exclusion from this theory though is free will, since all things have already been set in motion and are just parts of a predictable machine. Newton feared that this notion of "everything is predetermined" would lead to atheism. This theory was undermined by the second law of thermodynamics ( the total entropy of any isolated thermodynamic system tends to increase over time, approaching a maximum value) and quantum physics with its unpredictable random behavior.
    • Abdelrahman Ogail
       
      "God's only contribution to the universe was to set everything in motion, and from there the laws of science took hold and have governed every sequence of events since that time" <-- ???
Abdelrahman Ogail

Genetic algorithm - Wikipedia, the free encyclopedia - 0 views

  • A genetic algorithm (GA) is a search technique used in computing to find exact or approximate solutions to optimization and search problems. Genetic algorithms are categorized as global search heuristics. Genetic algorithms are a particular class of evolutionary algorithms (EA) that use techniques inspired by evolutionary biology such as inheritance, mutation, selection, and crossover (also called recombination).
  • A typical genetic algorithm requires: a genetic representation of the solution domain, a fitness function to evaluate the solution domain.
  •  
    GAs are primarily used for learning in AI
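Tying the two highlights together (a genetic representation plus a fitness function), here is a minimal GA sketch in Python. The bit-string genome and the "count the 1s" fitness are toy choices for illustration only:

```python
import random

GENOME_LEN, POP_SIZE, GENERATIONS = 20, 30, 100

def fitness(genome):                      # toy fitness: number of 1-bits ("OneMax")
    return sum(genome)

def crossover(a, b):                      # single-point recombination
    cut = random.randint(1, GENOME_LEN - 1)
    return a[:cut] + b[cut:]

def mutate(genome, rate=0.02):            # flip each bit with a small probability
    return [g ^ 1 if random.random() < rate else g for g in genome]

population = [[random.randint(0, 1) for _ in range(GENOME_LEN)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    def select():                         # tournament selection: better of two random individuals
        return max(random.sample(population, 2), key=fitness)
    population = [mutate(crossover(select(), select())) for _ in range(POP_SIZE)]

best = max(population, key=fitness)
print(best, fitness(best))
```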
Abdelrahman Ogail

Genetic programming - Wikipedia, the free encyclopedia - 0 views

  • In artificial intelligence, genetic programming (GP) is an evolutionary algorithm-based methodology inspired by biological evolution to find computer programs that perform a user-defined task. It is a specialization of genetic algorithms (GA) where each individual is a computer program. Therefore it is a machine learning technique used to optimize a population of computer programs according to a fitness landscape determined by a program's ability to perform a given computational task.
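The difference from a plain GA is that each individual is a program. A toy Python sketch where programs are small expression trees evolved toward an assumed target function f(x) = x^2 + x; the representation and operators are illustrative, not taken from any particular GP system:

```python
import random

OPS = {'+': lambda a, b: a + b, '-': lambda a, b: a - b, '*': lambda a, b: a * b}

def random_tree(depth=3):
    if depth == 0 or random.random() < 0.3:
        return 'x' if random.random() < 0.5 else random.randint(-2, 2)
    return (random.choice(list(OPS)), random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    if tree == 'x':
        return x
    if isinstance(tree, int):
        return tree
    op, left, right = tree
    return OPS[op](evaluate(left, x), evaluate(right, x))

def error(tree):   # fitness = squared error against the assumed target x**2 + x
    return sum((evaluate(tree, x) - (x * x + x)) ** 2 for x in range(-5, 6))

def mutate(tree):  # replace a random subtree with a freshly generated one
    if isinstance(tree, tuple) and random.random() < 0.7:
        op, left, right = tree
        return (op, mutate(left), right) if random.random() < 0.5 else (op, left, mutate(right))
    return random_tree(2)

population = [random_tree() for _ in range(50)]
for _ in range(200):
    population.sort(key=error)                       # keep the better half, mutate copies of it
    population = population[:25] + [mutate(random.choice(population[:25])) for _ in range(25)]
print(min(population, key=error))
```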
Islam TeCNo

OpenGL - Wikipedia, the free encyclopedia - 0 views

shared by Islam TeCNo on 10 Jun 09
  • OpenGL (Open Graphics Library) is a standard specification defining a cross-language, cross-platform API for writing applications that produce 2D and 3D computer graphics. The interface consists of over 250 different function calls which can be used to draw complex three-dimensional scenes from simple primitives. OpenGL was developed by Silicon Graphics Inc. (SGI) in 1992[1] and is widely used in CAD, virtual reality, scientific visualization, information visualization, and flight simulation. It is also used in video games, where it competes with Direct3D on Microsoft Windows platforms (see Direct3D vs. OpenGL). OpenGL is managed by the non-profit technology consortium, the Khronos Group.
    • Mohamed Abd El Monem
       
      just some info about OGL :)
    • Islam TeCNo
       
      A question poses itself!! ... how do developers make an API that works with any language?! ... I mean, there's OpenGL from Python, OpenGL from C++, OpenGL from C#!!
  • Mark Segal and Kurt Akeley authored the original OpenGL specification
    • Islam TeCNo
       
      2 Names to remember :D
    • Islam TeCNo
       
      LOL @ Book Names
  • (which actually has a white cover)
    • Islam TeCNo
       
      Realy LOL :D :D
  • The OpenGL standard allows individual vendors to provide additional functionality through extensions as new technology is created. Extensions may introduce new functions and new constants, and may relax or remove restrictions on existing OpenGL functions. Each vendor has an alphabetic abbreviation that is used in naming their new functions and constants. For example, NVIDIA's abbreviation (NV) is used in defining their proprietary function glCombinerParameterfvNV() and their constant GL_NORMAL_MAP_NV.
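To make the "function calls that draw from simple primitives" point concrete (and to answer the question above: OpenGL is specified as a C API, and other languages reach it through bindings), here is a minimal sketch using the PyOpenGL binding. It assumes PyOpenGL and a GLUT implementation such as freeglut are installed; it is an illustration, not part of the article:

```python
import sys
from OpenGL.GL import (glBegin, glClear, glColor3f, glEnd, glFlush,
                       glVertex2f, GL_COLOR_BUFFER_BIT, GL_TRIANGLES)
from OpenGL.GLUT import (glutCreateWindow, glutDisplayFunc, glutInit,
                         glutInitDisplayMode, glutInitWindowSize,
                         glutMainLoop, GLUT_RGB, GLUT_SINGLE)

def display():
    glClear(GL_COLOR_BUFFER_BIT)
    glBegin(GL_TRIANGLES)                 # a scene built from a simple primitive
    glColor3f(1.0, 0.0, 0.0); glVertex2f(-0.5, -0.5)
    glColor3f(0.0, 1.0, 0.0); glVertex2f(0.5, -0.5)
    glColor3f(0.0, 0.0, 1.0); glVertex2f(0.0, 0.5)
    glEnd()
    glFlush()

glutInit(sys.argv)
glutInitDisplayMode(GLUT_SINGLE | GLUT_RGB)
glutInitWindowSize(400, 300)
glutCreateWindow(b"OpenGL triangle")
glutDisplayFunc(display)
glutMainLoop()
```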
Abdelrahman Ogail

Production system - Wikipedia, the free encyclopedia - 0 views

  • A production system (or production rule system) is a computer program typically used to provide some form of artificial intelligence, which consists primarily of a set of rules about behavior. These rules, termed productions, are a basic representation found useful in AI planning, expert systems and action selection. A production system provides the mechanism necessary to execute productions in order to achieve some goal for the system. Productions consist of two parts: a sensory precondition (or "IF" statement) and an action (or "THEN"). If a production's precondition matches the current state of the world, then the production is said to be triggered. If a production's action is executed, it is said to have fired. A production system also contains a database, sometimes called working memory, which maintains data about current state or knowledge, and a rule interpreter. The rule interpreter must provide a mechanism for prioritizing productions when more than one is triggered.
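A minimal sketch of the trigger/fire cycle described above: a working memory of facts, a list of IF/THEN productions, and an interpreter whose conflict-resolution strategy is simply "first triggered rule wins". The rules themselves are invented examples:

```python
# Working memory: the system's current facts.
working_memory = {"hungry", "has_money"}

# Productions: (precondition facts, fact to add when the rule fires).
rules = [
    ({"hungry", "has_money"}, "buy_food"),
    ({"buy_food"}, "eat"),
    ({"eat"}, "satisfied"),
]

def run(memory, rules, max_cycles=10):
    for _ in range(max_cycles):
        # A rule is "triggered" when its precondition matches working memory.
        triggered = [(cond, action) for cond, action in rules
                     if cond <= memory and action not in memory]
        if not triggered:
            break
        cond, action = triggered[0]   # conflict resolution: just take the first match
        memory.add(action)            # the production "fires"
        print(f"fired: IF {sorted(cond)} THEN {action}")
    return memory

print(run(working_memory, rules))
```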
Islam TeCNo

Rich Internet application - Wikipedia, the free encyclopedia - 0 views

  • Rich Internet applications (RIAs) are web applications that have some of the characteristics of desktop applications, typically delivered by way of proprietary web-browser plug-ins or independently via sandboxes or virtual machines[1]. Examples of RIA frameworks include Curl (programming language), Adobe Flash/Adobe Flex/AIR, Java/JavaFX[2], uniPaaS[3] and Microsoft Silverlight[4].
    • Abdelrahman Ogail
       
      RIA definition. These seem to be applications that can be used for both web and desktop development, like Java
    • Islam TeCNo
       
      (Y) ... I also wanted to know what this RIA thing is :D
  •  
    Very clear document :D ... needs no explanation :D
Islam TeCNo

Uniform Resource Identifier - Wikipedia, the free encyclopedia - 0 views

shared by Islam TeCNo on 16 Jun 09
  • In computing, a Uniform Resource Identifier (URI) consists of a string of characters used to identify or name a resource on the Internet. Such identification enables interaction with representations of the resource over a network (typically the World Wide Web) using specific protocols. Schemes specifying a specific syntax and associated protocols define each URI.
    • Abdelrahman Ogail
       
      I was confused between URL & URI till I read this article!
    • Islam TeCNo
       
      A URL is a type of URI :D ... I remember I saw this bit in a book about HTTP but forgot it ... Thanks Zikas again
  • A Uniform Resource Name (URN) functions like a person's name, while a Uniform Resource Locator (URL) resembles that person's street address. The URN defines an item's identity, while the URL provides a method for finding it. The ISBN system for uniquely identifying books provides a typical example of the use of typical URNs. ISBN 0486275574 (urn:isbn:0-486-27557-4) cites unambiguously a specific edition of Shakespeare's play Romeo and Juliet. In order to gain access to this object and read the book, one would need its location: a URL address. A typical URL for this book on a unix-like operating system might look like the file path file:///home/username/RomeoAndJuliet.pdf, identifying the electronic book saved in a local hard disk. So URNs and URLs have complementary purposes.
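The URN versus URL distinction is easy to poke at with Python's standard urllib.parse module, using the identifiers from the excerpt:

```python
from urllib.parse import urlparse

urn = urlparse("urn:isbn:0-486-27557-4")                    # names the book, says nothing about location
url = urlparse("file:///home/username/RomeoAndJuliet.pdf")  # says where one particular copy lives

print(urn.scheme, urn.path)   # urn isbn:0-486-27557-4
print(url.scheme, url.path)   # file /home/username/RomeoAndJuliet.pdf
```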
Abdelrahman Ogail

Artificial life - Wikipedia, the free encyclopedia - 2 views

  • Artificial life (commonly Alife or alife) is a field of study and an associated art form which examine systems related to life, its processes, and its evolution through simulations using computer models, robotics, and biochemistry.[1] There are three main kinds of alife[2], named for their approaches: soft[3], from software; hard[4], from hardware; and wet, from biochemistry. Artificial life imitates traditional biology by trying to recreate biological phenomena.[5] The term "artificial life" is often used to specifically refer to soft alife
  • The modeling philosophy of alife strongly differs from traditional modeling, by studying not only “life-as-we-know-it”, but also “life-as-it-might-be” [7].
Abdelrahman Ogail

Common Gateway Interface - Wikipedia, the free encyclopedia - 0 views

  • The Common Gateway Interface (CGI) is a standard protocol for interfacing external application software with an information server, commonly a web server. The task of such an information server is to respond to requests (in the case of web servers, requests from client web browsers) by returning output. Each time a request is received, the server analyzes what the request asks for, and returns the appropriate output. The two basic methods for the server to do this are the following: If the request identifies a file stored on disk, then return the contents of that file. If the request identifies an executable command and possibly arguments, then run the command and return its output. CGI defines a standard way of doing the second. It defines how information about the server and the request is passed to the command in the form of arguments and environment variables, and how the command can pass back extra information about the output (such as the type) in the form of headers.
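The second method (run a command and return its output) is exactly what a CGI script is. A minimal sketch, assuming a web server configured to execute it: request details arrive as environment variables, and headers plus a blank line plus the body go back on standard output:

```python
#!/usr/bin/env python3
import os

# Request information arrives as environment variables set by the server.
method = os.environ.get("REQUEST_METHOD", "GET")
query = os.environ.get("QUERY_STRING", "")

# Extra information about the output (here, its type) goes back as headers,
# then a blank line, then the body itself.
print("Content-Type: text/html")
print()
print(f"<html><body><p>Method: {method}, query: {query}</p></body></html>")
```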
Islam TeCNo

Database - Wikipedia, the free encyclopedia - 0 views

shared by Islam TeCNo on 08 Jun 09
  • A database is a structured collection of records or data that is stored in a computer system. The structure is achieved by organizing the data according to a database model. The model in most common use today is the relational model. Other models such as the hierarchical model and the network model use a more explicit representation of relationships
    • Abdelrahman Ogail
       
      Database official definition
    • Islam TeCNo
       
      Yes ... but I think understanding is far more important than knowing the definition (this is just a test post hehe)
    • Islam TeCNo
       
      But in the File Structures course we were taught that a database is a set of related files
  • increase their speed
  • common kind of index is a sorted list of the contents of some particular table column, with pointers to the row associated with the value
  • Typically, indexes are also stored in the various forms of data-structure mentioned above (such as B-trees, hashes, and linked lists)
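The "sorted list of column values with pointers to rows" idea can be sketched in a few lines of Python; the tiny table below is invented purely for illustration:

```python
import bisect

# A tiny "table": row id -> record (name, age).
table = {1: ("Alice", 30), 2: ("Carol", 25), 3: ("Bob", 35), 4: ("Dave", 25)}

# An index on the age column: a sorted list of (value, row id) pairs.
age_index = sorted((age, row_id) for row_id, (_, age) in table.items())

def rows_with_age(age):
    """Binary-search the index instead of scanning the whole table."""
    i = bisect.bisect_left(age_index, (age, float("-inf")))
    while i < len(age_index) and age_index[i][0] == age:
        yield table[age_index[i][1]]
        i += 1

print(list(rows_with_age(25)))   # [('Carol', 25), ('Dave', 25)]
```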
Islam TeCNo

List of network protocols - Wikipedia, the free encyclopedia - 0 views

  • IPv4 Internet Protocol version 4
    • Islam TeCNo
       
      This is supposedly the one we're mainly using now ... for example my IP is 217.154.89.2 ... that's IPv4, and later we're supposed to upgrade and move to IPv6 :D (see the ipaddress sketch at the end of this list)
  • IPv6 Internet Protocol version 6
    • Islam TeCNo
       
      This is supposed to be the new protocol that will, God willing, be rolled out later on
    • Ahmed Mansour
       
      Currently there are two types of Internet Protocol (IP) addresses in active use: IP version 4 (IPv4) and IP version 6 (IPv6). IPv4 was initially deployed on 1 January 1983 and is still the most commonly used version. IPv4 addresses are 32-bit numbers often expressed as 4 octets in "dotted decimal" notation (for example, 192.0.2.53). Deployment of the IPv6 protocol began in 1999. IPv6 addresses are 128-bit numbers and are conventionally expressed using hexadecimal strings (for example, 2001:0db8:582:ae33::29). Source: http://www.iana.org/numbers/ Thanks TeCNo for your useful comments :) keep it up (Y)
  • TCP Transmission Control Protocol UDP User Datagram Protocol
    • Islam TeCNo
       
      The difference between TCP and UDP is that TCP checks that packets are delivered correctly, while UDP just sends without checking ... so UDP is faster than TCP, but TCP guarantees the data arrives correctly ... that's why UDP is used in applications like video streaming (YouTube) and voice streaming (see the socket sketch at the end of this list)
  • Jabber, an instant-messaging protocol
    • Islam TeCNo
       
      Great protocol ... also called XMPP, and this is the protocol used by Google Talk
    • Islam TeCNo
       
      Port 5222, I think, is its default :D
  • Internet Relay Chat (IRC)
    • Islam TeCNo
       
      I think this is a very old chat protocol
  • ED2K, A peer-to-peer file sharing protocol
    • Islam TeCNo
       
      I think this is used by the eMule P2P program!! A great P2P file-sharing program
  • FTP, File Transfer Protocol
    • Islam TeCNo
       
      Uses port 21 :D
    • Islam TeCNo
       
      Loooooooot of protocols ... anyone know simple explanations?
  • HTTP, HyperText Transfer Protocol
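Two small sketches for the comments above. First, the IPv4/IPv6 size difference, checked with Python's standard ipaddress module and the example addresses quoted in the thread:

```python
import ipaddress

v4 = ipaddress.ip_address("192.0.2.53")
v6 = ipaddress.ip_address("2001:0db8:582:ae33::29")

print(type(v4).__name__, v4.version)   # IPv4Address 4  (a 32-bit address)
print(type(v6).__name__, v6.version)   # IPv6Address 6  (a 128-bit address)
```

Second, the TCP versus UDP reliability point comes down to the socket type: SOCK_STREAM (TCP, connection-oriented, lost data is retransmitted) versus SOCK_DGRAM (UDP, fire-and-forget). A loopback sketch:

```python
import socket
import threading

def udp_demo():
    server = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    server.bind(("127.0.0.1", 0))                  # UDP: no connection, no delivery guarantee
    port = server.getsockname()[1]

    client = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    client.sendto(b"hello over UDP", ("127.0.0.1", port))
    data, _ = server.recvfrom(1024)
    print("UDP got:", data)

def tcp_demo():
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))
    server.listen(1)                               # TCP: connection set-up, ordered, retransmitted
    port = server.getsockname()[1]

    def serve():
        conn, _ = server.accept()
        print("TCP got:", conn.recv(1024))
        conn.close()

    t = threading.Thread(target=serve)
    t.start()
    client = socket.create_connection(("127.0.0.1", port))
    client.sendall(b"hello over TCP")
    client.close()
    t.join()

udp_demo()
tcp_demo()
```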
Islam TeCNo

Model-view-controller - Wikipedia, the free encyclopedia - 0 views

  • Model–view–controller (MVC) is an architectural pattern used in software engineering. Successful use of the pattern isolates business logic from user interface considerations, resulting in an application where it is easier to modify either the visual appearance of the application or the underlying business rules without affecting the other. In MVC, the model represents the information (the data) of the application; the view corresponds to elements of the user interface such as text, checkbox items, and so forth; and the controller manages the communication of data and the business rules used to manipulate the data to and from the model.
    • Abdelrahman Ogail
       
      MVC is one of the most important patterns used in any software, especially in web development, database systems, and of course in game development
    • Islam TeCNo
       
      Please, Zikas, add more comments because I don't really get it ... what I understood is that I separate the GUI from the core code
  • MVC is often seen in web applications, where the view is the actual HTML or XHTML page, and the controller is the code that gathers dynamic data and generates the content within the HTML or XHTML. Finally, the model is represented by the actual content, which is often stored in a database or in XML nodes, and the business rules that transform that content based on user actions.
    • Islam TeCNo
       
      I think this is like a PHP or ASP page ... you just see HTML (the view) that is generated by PHP/ASP code (the controller), which gathers data from the database (the model)
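A very small MVC sketch with made-up names, matching the web description above: the model holds the data and the business rules, the view only renders HTML, and the controller mediates between them:

```python
# Model: the data and the rules for changing it.
class TaskModel:
    def __init__(self):
        self.tasks = []

    def add(self, title):
        if title.strip():                    # a "business rule": no empty tasks
            self.tasks.append(title.strip())

# View: turns model data into HTML; knows nothing about storage or input handling.
class TaskView:
    def render(self, tasks):
        items = "".join(f"<li>{t}</li>" for t in tasks)
        return f"<ul>{items}</ul>"

# Controller: receives "requests", updates the model, asks the view to render.
class TaskController:
    def __init__(self, model, view):
        self.model, self.view = model, view

    def handle_add(self, title):
        self.model.add(title)
        return self.view.render(self.model.tasks)

controller = TaskController(TaskModel(), TaskView())
controller.handle_add("write report")
print(controller.handle_add("review patch"))   # <ul><li>write report</li><li>review patch</li></ul>
```

Swapping the view for, say, a JSON renderer would not touch the model or the controller, which is exactly the isolation the excerpt describes.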
Islam TeCNo

BBCode - Wikipedia, the free encyclopedia - 0 views

shared by Islam TeCNo on 26 Jun 09
  • Bulletin Board Code or BBCode is a lightweight markup language used to format posts in many message boards. The available tags are usually indicated by square brackets surrounding a keyword, and they are parsed by the message board system before being translated into a markup language that web browsers understand—usually HTML or XHTML.
    • Islam TeCNo
       
      Simple :D
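The square-brackets-to-HTML translation is essentially a pass of regular-expression substitutions. A minimal sketch covering three common tags (a real board would also escape raw HTML in the input first):

```python
import re

def bbcode_to_html(text):
    """Translate a few common BBCode tags into HTML (illustrative subset only)."""
    rules = [
        (r"\[b\](.*?)\[/b\]", r"<strong>\1</strong>"),
        (r"\[i\](.*?)\[/i\]", r"<em>\1</em>"),
        (r"\[url=(.*?)\](.*?)\[/url\]", r'<a href="\1">\2</a>'),
    ]
    for pattern, replacement in rules:
        text = re.sub(pattern, replacement, text, flags=re.DOTALL)
    return text

print(bbcode_to_html("[b]Hello[/b] [url=http://example.com]a [i]link[/i][/url]"))
# <strong>Hello</strong> <a href="http://example.com">a <em>link</em></a>
```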
Abdelrahman Ogail

SOAP - Wikipedia, the free encyclopedia - 0 views

  • SOAP, originally defined as Simple Object Access Protocol, is a protocol specification for exchanging structured information in the implementation of Web Services in computer networks. It relies on Extensible Markup Language (XML) as its message format, and usually relies on other Application Layer protocols (most notably Remote Procedure Call (RPC) and HTTP) for message negotiation and transmission. SOAP can form the foundation layer of a web services protocol stack, providing a basic messaging framework upon which web services can be built.
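Since a SOAP message is just XML, the standard library is enough to sketch what an envelope looks like. The GetPrice operation and its Item element are invented for illustration; only the envelope namespace comes from the SOAP 1.1 specification:

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"
ET.register_namespace("soap", SOAP_NS)

envelope = ET.Element(f"{{{SOAP_NS}}}Envelope")
body = ET.SubElement(envelope, f"{{{SOAP_NS}}}Body")

# Hypothetical application payload: a request to some GetPrice operation.
request = ET.SubElement(body, "GetPrice")
ET.SubElement(request, "Item").text = "OpenGL Programming Guide"

print(ET.tostring(envelope, encoding="unicode"))
```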
Islam TeCNo

Deep Blue (chess computer) - Wikipedia, the free encyclopedia - 0 views

  • Deep Blue was a chess-playing computer developed by IBM. On May 11, 1997, the machine won a six-game match by two wins to one with three draws against world champion Garry Kasparov.[1] Kasparov accused IBM of cheating and demanded a rematch, but IBM declined and dismantled Deep Blue.[2] Kasparov had beaten a previous version of Deep Blue in 1996
    • Abdelrahman Ogail
       
      When AI beats humanity!
  • Deep Blue was then heavily upgraded (unofficially nicknamed "Deeper Blue")[11] and played Kasparov again in May 1997, winning the six-game rematch 3½–2½. The match ended on May 11 with game six, making Deep Blue the first computer system to defeat a reigning world champion in a match under standard chess tournament time controls.
  • The system derived its playing strength mainly out of brute force computing power.
    • Islam TeCNo
       
      This is what they call brute force, the monstrous kind :D
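"Brute force" here means searching the game tree far deeper than any human can. The core idea is minimax search over future positions; Deep Blue added much more on top (special-purpose hardware, alpha-beta pruning, opening and endgame databases), so this toy sketch on an abstract game is only the skeleton of the idea:

```python
def minimax(state, depth, maximizing, evaluate, moves, apply_move):
    """Exhaustively search `depth` plies ahead and back up the best achievable score."""
    legal = moves(state)
    if depth == 0 or not legal:
        return evaluate(state), None
    best_move = None
    if maximizing:
        best = float("-inf")
        for m in legal:
            score, _ = minimax(apply_move(state, m), depth - 1, False, evaluate, moves, apply_move)
            if score > best:
                best, best_move = score, m
    else:
        best = float("inf")
        for m in legal:
            score, _ = minimax(apply_move(state, m), depth - 1, True, evaluate, moves, apply_move)
            if score < best:
                best, best_move = score, m
    return best, best_move

# Toy game: the state is a number, each move adds one or doubles it;
# the maximizer wants it large after 4 plies, the minimizer wants it small.
score, move = minimax(1, 4, True, evaluate=lambda s: s,
                      moves=lambda s: ["+1", "*2"],
                      apply_move=lambda s, m: s + 1 if m == "+1" else s * 2)
print(score, move)
```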
Abdelrahman Ogail

Voice over Internet Protocol - Wikipedia, the free encyclopedia - 0 views

shared by Abdelrahman Ogail on 01 Jul 09
Ahmed One liked it
  • Voice over Internet Protocol (VoIP) is a general term for a family of transmission technologies for delivery of voice communications over IP networks such as the Internet or other packet-switched network
  • Internet telephony refers to communications services—voice, facsimile, and/or voice-messaging applications—that are transported via the Internet, rather than the public switched telephone network (PSTN). The basic steps involved in originating an Internet telephone call are conversion of the analog voice signal to digital format and compression/translation of the signal into Internet protocol (IP) packets for transmission over the Internet; the process is reversed at the receiving end.[1]
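The "digitise, packetise, transmit" steps can be sketched with the standard library: generate one second of 8 kHz 16-bit PCM, cut it into 20 ms frames, and push each frame into a UDP datagram aimed at a hypothetical receiver on localhost. A real VoIP stack would add a codec, RTP headers, and jitter buffering:

```python
import math
import socket
import struct

SAMPLE_RATE = 8000                      # 8 kHz, the classic telephony rate
FRAME_MS = 20                           # 20 ms of audio per packet
SAMPLES_PER_FRAME = SAMPLE_RATE * FRAME_MS // 1000

# 1. "Analog to digital": one second of a 440 Hz tone as 16-bit PCM samples.
pcm = b"".join(struct.pack("<h", int(10000 * math.sin(2 * math.pi * 440 * n / SAMPLE_RATE)))
               for n in range(SAMPLE_RATE))

# 2. Packetise and send each frame as a UDP datagram (to a hypothetical receiver).
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
for offset in range(0, len(pcm), SAMPLES_PER_FRAME * 2):    # 2 bytes per 16-bit sample
    frame = pcm[offset:offset + SAMPLES_PER_FRAME * 2]
    sock.sendto(frame, ("127.0.0.1", 5004))                 # 5004 is a common RTP port choice
print("sent", len(pcm) // (SAMPLES_PER_FRAME * 2), "frames")
```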
Islam TeCNo

LOL - Wikipedia, the free encyclopedia - 0 views

shared by Islam TeCNo on 25 Jun 09
  • LOL (also written with some or all letters lowercase) is an abbreviation for laughing out loud[1][2] or laugh out loud.[3] LOL is a common element of Internet slang used historically on Usenet, but now widespread in other forms of computer-mediated communication, and even face-to-face communication. It is one of many initialisms for expressing bodily reactions, in particular laughter, as text, including initialisms such as ROTFL[4][5][6][7] or ROFL[8] ("roll(ing) on the floor laughing"), a more emphatic expression of laughter, and BWL ("bursting with laughter"), above which there is "no greater compliment" according to technology columnist Larry Magid.[9] Other unrelated expansions include the now mostly historical "lots of luck" or "lots of love" used in letter-writing.[10]
    • Abdelrahman Ogail
       
      Source of the LOL
    • Islam TeCNo
       
      hehe LOL :D
  • Corruptions of "LOL"
    • Abdelrahman Ogail
       
      This is a big LOL
Abdelrahman Ogail

ELIZA - Wikipedia, the free encyclopedia - 0 views

  • ELIZA was a computer program and an early example (by modern standards) of primitive natural language processing. ELIZA operated by processing users' responses to scripts, the most famous of which was DOCTOR, a simulation of a Rogerian psychotherapist. In this mode, ELIZA mostly rephrased the user's statements as questions and posed those to the 'patient.' ELIZA was written by Joseph Weizenbaum between 1964 and 1966.
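The "rephrase the statement as a question" trick is a small set of pattern rules. A toy sketch in the spirit of the DOCTOR script; the patterns below are invented, not Weizenbaum's originals:

```python
import re

# (pattern, response template) pairs, tried in order; the catch-all comes last.
RULES = [
    (r"i need (.*)", "Why do you need {0}?"),
    (r"i am (.*)", "How long have you been {0}?"),
    (r"my (.*)", "Tell me more about your {0}."),
    (r"(.*)", "Please go on."),
]

def eliza(statement):
    text = statement.lower().strip(".!?")
    for pattern, template in RULES:
        match = re.match(pattern, text)
        if match:
            return template.format(*match.groups())
    return "Please go on."

print(eliza("I am feeling stuck on this project."))  # How long have you been feeling stuck on this project?
print(eliza("My code never works."))                 # Tell me more about your code never works.
```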