Computer Science Knowledge Sharing / Group items tagged "and"

Abdelrahman Ogail

Simulated annealing - Wikipedia, the free encyclopedia - 1 views

  • Simulated annealing (SA) is a generic probabilistic metaheuristic for the global optimization problem of applied mathematics, namely locating a good approximation to the global minimum of a given function in a large search space. It is often used when the search space is discrete (e.g., all tours that visit a given set of cities). For certain problems, simulated annealing may be more effective than exhaustive enumeration — provided that the goal is merely to find an acceptably good solution in a fixed amount of time, rather than the best possible solution. The name and inspiration come from annealing in metallurgy, a technique involving heating and controlled cooling of a material to increase the size of its crystals and reduce their defects. The heat causes the atoms to become unstuck from their initial positions (a local minimum of the internal energy) and wander randomly through states of higher energy; the slow cooling gives them more chances of finding configurations with lower internal energy than the initial one. By analogy with this physical process, each step of the SA algorithm replaces the current solution by a random "nearby" solution, chosen with a probability that depends on the difference between the corresponding function values and on a global parameter T (called the temperature), that is gradually decreased during the process. The dependency is such that the current solution changes almost randomly when T is large, but increasingly "downhill" as T goes to zero. The allowance for "uphill" moves saves the method from becoming stuck at local minima—which are the bane of greedier methods. The method was independently described by S. Kirkpatrick, C. D. Gelatt and M. P. Vecchi in 1983 [1], and by V. Černý in 1985 [2]. The method is an adaptation of the Metropolis-Hastings algorithm, a Monte Carlo method to generate sample states of a thermodynamic system, invented by N. Metropolis et al. in 1953 [3].
  •  
    A natural AI approach
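The excerpt above already states the two ingredients of the algorithm: a temperature schedule and a probabilistic accept rule for "uphill" moves. Below is a minimal Python sketch of that idea; the geometric cooling schedule, the Gaussian neighbour move, and the toy cost function are illustrative assumptions, not taken from the bookmarked article.

```python
import math
import random

def simulated_annealing(cost, neighbor, x0, t0=10.0, t_min=1e-3, alpha=0.95, steps_per_t=100):
    """Minimize cost(x) starting from x0; return the best state found."""
    x, fx = x0, cost(x0)
    best, fbest = x, fx
    t = t0
    while t > t_min:
        for _ in range(steps_per_t):
            y = neighbor(x)                      # random "nearby" candidate
            fy = cost(y)
            # Always accept downhill moves; accept uphill moves with
            # probability exp(-(fy - fx) / T), which shrinks as T cools.
            if fy < fx or random.random() < math.exp(-(fy - fx) / t):
                x, fx = y, fy
                if fx < fbest:
                    best, fbest = x, fx
        t *= alpha                               # geometric cooling schedule
    return best, fbest

# Toy usage: a 1-D function with many local minima.
f = lambda x: x * x + 10 * math.sin(3 * x)
state, value = simulated_annealing(f, lambda x: x + random.gauss(0, 1), x0=8.0)
print(state, value)
```

At high T the exponential is close to 1, so the walk is nearly random; as T decays the method behaves more and more like greedy descent, which is exactly the dependency the excerpt describes.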
computersciencej

TCP/IP model questions based Study Material for gate Computer Science - 0 views

  •  
    TCP/IP protocol based questions for the GATE computer science exam. Q1. What is the difference between the transport and session layers of the OSI model? Answer: OSI Model Transport Layer: The transport layer uses the services provided by the network layer, such as best path selection and logical addressing, to provide end-to-end communication between source and destination. The transport-layer data stream is a logical connection between the endpoints of a network. End-to-end control and reliability are provided by sliding windows, sequence numbers, and acknowledgments. The transport layer regulates information flow to ensure end-to-end connectivity between host applications reliably and accurately. Layer 4 of the TCP/IP stack (the transport layer) has two protocols: TCP and UDP. The transport layer accepts data from the session layer and segments it for transport across the network. Generally, the transport layer is responsible for making sure that the data is delivered error-free and in the proper sequence, and flow control generally occurs at this layer. OSI Model Session Layer: The session layer establishes, manages, and terminates communication sessions between applications located on different network devices. These sessions consist of service requests and service responses (mini-conversations) that are coordinated by protocols implemented at the session layer. The session layer also decides whether to use two-way simultaneous or two-way alternate communication.
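To make the transport-layer half of the answer concrete, here is a minimal Python sketch of an end-to-end TCP exchange; the sequencing, acknowledgments, and flow control described above all happen beneath this socket API, while session-layer concerns (dialogue control) would sit above it. The loopback address and port number are arbitrary choices for illustration.

```python
import socket
import threading
import time

def echo_server(host="127.0.0.1", port=5050):
    # TCP (transport layer) delivers an ordered, reliable byte stream;
    # sequence numbers, ACKs, and sliding-window flow control are handled below this API.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.bind((host, port))
        srv.listen(1)
        conn, _ = srv.accept()
        with conn:
            conn.sendall(conn.recv(1024))        # echo the data back

threading.Thread(target=echo_server, daemon=True).start()
time.sleep(0.2)                                  # give the server time to start listening

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as cli:
    cli.connect(("127.0.0.1", 5050))             # end-to-end connection between endpoints
    cli.sendall(b"hello")
    print(cli.recv(1024))                        # b'hello'
```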
Abdelrahman Ogail

CodeProject: C# vs C/C++ Performance. Free source code and programming help - 0 views

  • C# is compiled twice: once when the program is written and again when it is executed at the user's site. The first compilation is done by your C# builder and the second by the .NET Framework on the user's machine. The reason why C# compiled applications could be faster is that, during the second compilation, the compiler knows the actual run-time environment and processor type and can generate instructions that target a specific processor.
  • A well designed C# program is more than 90% as fast as an equivalent "well-designed" C++ program
  • The problem with "not-freeing" the memory at the right time is that the working set of the application increases which increases the number of "page faults"
  • That's a nice question. Except for time-critical blocks of code, prefer C#. Write all your algorithmic code in C++ (not VC++ .NET), compile it into a DLL, and call it from C# through DLL interop. This should balance the performance. The technique is not new and was not invented by me or anyone here; it is similar to the age-old C vs. assembly debate, where one camp argued that assembly programming is faster and the other that C is easier to develop in, and people ended up embedding assembly within a C program (an asm block) for time-critical sections.
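The suggestion in the last highlight is C#-specific (a native C++ DLL called through interop), but the underlying pattern, keeping the hot path in compiled native code and calling it from a higher-level language, can be sketched in Python with ctypes. This is an analogy rather than the technique from the thread, and it assumes a Unix-like system where the C math library can be located.

```python
import ctypes
import ctypes.util

# Load the native C math library and describe the signature of sqrt()
# so ctypes marshals the double correctly across the language boundary.
libm = ctypes.CDLL(ctypes.util.find_library("m"))
libm.sqrt.argtypes = [ctypes.c_double]
libm.sqrt.restype = ctypes.c_double

print(libm.sqrt(2.0))   # 1.4142135623730951
```

The trade-off is the same one the thread describes for C# and C++: the native code itself runs fast, but every crossing of the interop boundary carries marshalling overhead, so it pays off only for genuinely time-critical blocks.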
Janos Haits

About SoLAR | Society for Learning Analytics Research (SoLAR) - 0 views

  •  
    "The Society for Learning Analytics Research (SoLAR) is an inter-disciplinary network of leading international researchers who are exploring the role and impact of analytics on teaching, learning, training and development. SoLAR has been active in organizing the International Conference on Learning Analytics & Knowledge (LAK) and the Learning Analytics Summer Institute (LASI), launching multiple initiatives to support collaborative and open research around learning analytics, promoting the publication and dissemination of learning analytics research, and advising and consulting with state, provincial, and national governments."
Janos Haits

Society for Learning Analytics Research (SoLAR) - 0 views

  •  
    "The Society for Learning Analytics Research (SoLAR) is an inter-disciplinary network of leading international researchers who are exploring the role and impact of analytics on teaching, learning, training and development. SoLAR has been active in organizing the International Conference on Learning Analytics & Knowledge (LAK) and the Learning Analytics Summer Institute (LASI), launching multiple initiatives to support collaborative and open research around learning analytics, promoting the publication and dissemination of learning analytics research, and advising and consulting with state, provincial, and national governments."
veera90

Best Pharmacovigilance Services | Pharmacovigilance Professionals | ACL Digital Life Sc... - 0 views

  •  
    From proof-of-concept to post-marketing surveillance, you can depend on our PV experts to work efficiently through the entire scope of pharmacovigilance activities. We offer a high level of expertise and help you meet the highest standards of applicable national and global regulations. The experts at ACL Digital can customize safety monitoring services to suit your specific business requirements. Most biopharmaceutical companies have distinct and demanding clinical safety requirements set by regulatory agencies, and our PV specialists plan safety and pharmacovigilance services that fit the needs of both your product and your study; we adhere and adapt to your processes and are flexible enough to do it the right way.
veera90

Key Home Automation Technologies that Shape the Industry - 0 views

  •  
    Businesses with effective automation initiatives make it a strategic priority to improve both the employee and the customer experience, and they use automation to do so. Pursuing automation technologies merely to lower expenses is not enough. To succeed while scaling new technologies, organizations should focus on the people involved (both employees and customers), commit to revamping current processes, and build new skill sets within the workforce to support more distinctive and innovative ways of working.
computersciencej

Difference Between File Transfer Protocol and Hyper Text Transfer Protocol - 0 views

  •  
    In this post under Computer Science Study Material for GATE, we explain the differences between File Transfer Protocol (FTP) and Hypertext Transfer Protocol (HTTP). Both FTP and HTTP were developed to make Internet transmission better. FTP is used to exchange files between computer accounts, to transfer files between an account and a desktop computer (upload), or to access software archives on the Internet. It's also commonly used to download programs and other files to your computer from other servers. It transfers files in two different formats: ASCII for text files and binary for binary files. To read the full article, click on the following link: http://www.computersciencejunction.in/2017/11/differences-between-ftp-and-http.html
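A minimal sketch of the two protocols side by side using Python's standard library; the host names and file name are placeholders for illustration, not real servers from the post.

```python
from ftplib import FTP
from urllib.request import urlopen

# FTP: a stateful, account-oriented session with explicit ASCII/binary transfer modes.
with FTP("ftp.example.com") as ftp:                      # placeholder host
    ftp.login()                                          # anonymous login
    with open("readme.txt", "wb") as local:
        ftp.retrbinary("RETR readme.txt", local.write)   # binary-mode download

# HTTP: a stateless request/response exchange; each resource is fetched by URL.
with urlopen("http://www.example.com/readme.txt") as resp:   # placeholder URL
    data = resp.read()
print(len(data))
```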
Abdelrahman Ogail

Flocking (behavior) - Wikipedia, the free encyclopedia - 0 views

  • Flocking behavior is the behavior exhibited when a group of birds, called a flock, are foraging or in flight. There are parallels with the shoaling behavior of fish, or the swarming behavior of insects. Computer simulations and mathematical models which have been developed to emulate the flocking behaviors of birds can generally also be applied to the "flocking" behavior of other species. As a result, the term "flocking" is sometimes applied, in computer science, to species other than birds. This article is about the modelling of flocking behavior. From the perspective of the mathematical modeller, "flocking" is the collective motion of a large number of self-propelled entities and is a collective animal behavior exhibited by many living beings such as birds, fish, bacteria, and insects.[1] It is considered an emergent behaviour arising from simple rules that are followed by individuals and does not involve any central coordination. Flocking behavior was first simulated on a computer in 1986 by Craig Reynolds with his simulation program, Boids. This program simulates simple agents (boids) that are allowed to move according to a set of basic rules. The result is akin to a flock of birds, a school of fish, or a swarm of insects.
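The local rules Reynolds' Boids program applies (separation, alignment, cohesion) are simple enough to show directly. Below is a minimal Python toy; the neighbourhood radius and rule weights are arbitrary illustrative values, not parameters from the article.

```python
import random

class Boid:
    def __init__(self):
        self.pos = [random.uniform(0, 100), random.uniform(0, 100)]
        self.vel = [random.uniform(-1, 1), random.uniform(-1, 1)]

def step(boids, radius=10.0, w_sep=0.05, w_ali=0.05, w_coh=0.01):
    """One update applying separation, alignment, and cohesion to each boid."""
    for b in boids:
        neighbors = [o for o in boids if o is not b and
                     sum((o.pos[i] - b.pos[i]) ** 2 for i in range(2)) < radius * radius]
        if not neighbors:
            continue
        for i in range(2):
            sep = sum(b.pos[i] - o.pos[i] for o in neighbors)                    # steer away from crowding
            ali = sum(o.vel[i] for o in neighbors) / len(neighbors) - b.vel[i]   # match neighbours' heading
            coh = sum(o.pos[i] for o in neighbors) / len(neighbors) - b.pos[i]   # move toward local centre
            b.vel[i] += w_sep * sep + w_ali * ali + w_coh * coh
    for b in boids:
        for i in range(2):
            b.pos[i] += b.vel[i]

flock = [Boid() for _ in range(30)]
for _ in range(100):
    step(flock)
print(flock[0].pos)
```

No boid has a global view; each reacts only to nearby flockmates, which is the "no central coordination" property the article emphasises.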
Abdelrahman Ogail

Clockwork universe theory - Wikipedia, the free encyclopedia - 1 views

  • The Clockwork Universe Theory is a theory, established by Isaac Newton, as to the origins of the universe. A "clockwork universe" can be thought of as being a clock wound up by God and ticking along, as a perfect machine, with its gears governed by the laws of physics. What sets this theory apart from others is the idea that God's only contribution to the universe was to set everything in motion, and from there the laws of science took hold and have governed every sequence of events since that time. This idea was very popular during the Enlightenment, when scientists realized that Newton's laws of motion, including the law of universal gravitation, could explain the behavior of the solar system. A notable exclusion from this theory though is free will, since all things have already been set in motion and are just parts of a predictable machine. Newton feared that this notion of "everything is predetermined" would lead to atheism. This theory was undermined by the second law of thermodynamics ( the total entropy of any isolated thermodynamic system tends to increase over time, approaching a maximum value) and quantum physics with its unpredictable random behavior.
    • Abdelrahman Ogail
       
      "God's only contribution to the universe was to set everything in motion, and from there the laws of science took hold and have governed every sequence of events since that time" <-- ???
Janos Haits

e-LICO Front Page | Data Mining Portal - 0 views

  •  
    "The goal of the e-LICO project is to build a virtual laboratory for interdisciplinary collaborative research in data mining and data-intensive sciences. The e-lab comprises three layers: the e-science and data mining layers form a generic research environment that can be adapted to different scientific domains by customizing the application layer. e-LICO uses both Taverna and RapidAnalytics/RapidMiner to design and enact data analysis workflows. It provides a variety of general-purpose and application-specific services and a broad toolkit to assist the user in designing such workflows."
Janos Haits

Magenta - 0 views

  •  
    "Magenta is a Google Brain project to ask and answer the questions, "Can we use machine learning to create compelling art and music? If so, how? If not, why not?" Our work is done in TensorFlow, and we regularly release our models and tools in open source. These are accompanied by demos, tutorial blog postings and technical papers. To follow our progress, watch our GitHub and join our discussion group."
veera90

Expert Biostatistics Services | Biostatistics | ACL Digital Life Sciences | IT Consulting - 0 views

  •  
    Biostatistics plays a vital role in clinical research. From protocol development and clinical trial design to sample size calculation, data analysis and more, our team of biostatisticians has the right SME expertise in multiple therapeutic areas to help deliver quality outcomes quickly and efficiently. You can rely on us to determine and apply the appropriate statistical model, write CSR sections, interact with regulatory authorities, and examine efficacy and safety data. Innovative and insight-driven statistical methods play a crucial role in every step of the drug development process. At ACL Digital, biostatistics remains an integral part of our services.
Abdelrahman Ogail

Theory of mind - Wikipedia, the free encyclopedia - 0 views

  • Theory of mind is the ability to attribute mental states—beliefs, intents, desires, pretending, knowledge, etc.—to oneself and others and to understand that others have beliefs, desires and intentions that are different from one's own.[1]
  • One of the most important milestones in theory of mind development is gaining the ability to attribute false belief: that is, to recognize that others can have beliefs about the world that are wrong. To do this, it is suggested, one must understand how knowledge is formed, that people’s beliefs are based on their knowledge, that mental states can differ from reality, and that people’s behavior can be predicted by their mental states. Numerous versions of the false-belief task have been developed, based on the initial task done by Wimmer and Perner (1983).
  • In the most common version of the false-belief task (often called the ‘Sally-Anne’ task), children are told or shown a story involving two characters. For example, in one version, the child is shown two dolls, Sally and Anne, playing with a marble. The dolls put away the marble in a box, and then Sally leaves. Anne takes the marble out and plays with it again, and after she is done, puts it away in a different box. Sally returns and the child is then asked where Sally will look for the marble. The child passes the task if she answers that Sally will look in the first box where she put the marble; the child fails the task if she answers that Sally will look in the second box, where the child knows the marble is hidden, even though Sally cannot know, since she did not see it hidden there. In order to pass the task, the child must be able to understand that another’s mental representation of the situation is different from their own, and the child must be able to predict behavior based on that understanding. The results of research using false-belief tasks have been fairly consistent: most normally-developing children are unable to pass the tasks until around the age of three or four.
    • Abdelrahman Ogail
       
      Try this test on your little brother if he/she is under 3 years old!
Janos Haits

Quantum UChicago.edu - 0 views

  •  
    'The Chicago Quantum Exchange (CQE) is an intellectual hub and partnership for advancing academic and industrial efforts in the science and engineering of quantum information. Members of CQE are focused on developing new ways of understanding and exploiting the laws of quantum mechanics, the fundamental yet counterintuitive theory that governs nature at its smallest scales. The overarching goal is to apply research innovations to develop radically new types of devices, materials, and computing techniques.'
Janos Haits

Blockstack - The New Decentralized Internet - 0 views

  •  
    'Blockstack is an open-source project and a decentralized network. For the past years, one company, Blockstack Inc, has taken the lead on protocol development. We plan to have many independent individuals and companies operating on the network and taking on greater roles in the protocol's development. We will release more details on governance structures and potential independent entities that can provide a degree of neutrality and balance to the protocol development in the long run.'
Janos Haits

The Freenet Project - /index - 0 views

  •  
    "Freenet is free software which lets you anonymously share files, browse and publish "freesites" (web sites accessible only through Freenet) and chat on forums, without fear of censorship. Freenet is decentralised to make it less vulnerable to attack, and if used in "darknet" mode, where users only connect to their friends, is very difficult to detect. Share files, chat on forums, browse and publish, anonymously and without fear of blocking or censorship! Then connect to your friends for even better security!"
Janos Haits

WireGuard: fast, modern, secure VPN tunnel - 0 views

  •  
    "WireGuard® is an extremely simple yet fast and modern VPN that utilizes state-of-the-art cryptography. It aims to be faster, simpler, leaner, and more useful than IPsec, while avoiding the massive headache. It intends to be considerably more performant than OpenVPN. WireGuard is designed as a general purpose VPN for running on embedded interfaces and super computers alike, fit for many different circumstances. Initially released for the Linux kernel, it is now cross-platform (Windows, macOS, BSD, iOS, Android) and widely deployable. It is currently under heavy development, but already it might be regarded as the most secure, easiest to use, and simplest VPN solution in the industry."
Janos Haits

Main Page - Time Machine - 0 views

  •  
    "Explore simultaneously in space and time with Time Machine Each Time Machine on this page captures a process in extreme detail over space and time, with billions of pixels of explorable resolution. Choose a time machine and zoom into the image while traveling backwards or forwards through time. Select a Time Warp and the time machine's authors will take you on a guided space-time tour with text annotations explaining what you are viewing. You can even learn how to create your own Time Machines and Warps."