Computer Science Knowledge Sharing - Group items tagged "find"

Simulated annealing - Wikipedia, the free encyclopedia - 1 views

  • Simulated annealing (SA) is a generic probabilistic metaheuristic for the global optimization problem of applied mathematics, namely locating a good approximation to the global minimum of a given function in a large search space. It is often used when the search space is discrete (e.g., all tours that visit a given set of cities). For certain problems, simulated annealing may be more effective than exhaustive enumeration — provided that the goal is merely to find an acceptably good solution in a fixed amount of time, rather than the best possible solution. The name and inspiration come from annealing in metallurgy, a technique involving heating and controlled cooling of a material to increase the size of its crystals and reduce their defects. The heat causes the atoms to become unstuck from their initial positions (a local minimum of the internal energy) and wander randomly through states of higher energy; the slow cooling gives them more chances of finding configurations with lower internal energy than the initial one. By analogy with this physical process, each step of the SA algorithm replaces the current solution by a random "nearby" solution, chosen with a probability that depends on the difference between the corresponding function values and on a global parameter T (called the temperature), that is gradually decreased during the process. The dependency is such that the current solution changes almost randomly when T is large, but increasingly "downhill" as T goes to zero. The allowance for "uphill" moves saves the method from becoming stuck at local minima—which are the bane of greedier methods. The method was independently described by S. Kirkpatrick, C. D. Gelatt and M. P. Vecchi in 1983 [1], and by V. Černý in 1985 [2]. The method is an adaptation of the Metropolis-Hastings algorithm, a Monte Carlo method to generate sample states of a thermodynamic system, invented by N. Metropolis et al. in 1953 [3].
  •  
    A natural AI approach (a minimal code sketch follows below)
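
A minimal sketch of the loop described above, written in C++ for illustration: the objective function, the Gaussian step, and the geometric cooling schedule are assumptions chosen only to show the Metropolis accept/reject rule from the excerpt, not anything the article prescribes.

    // Simulated annealing on a small multimodal 1-D function (illustrative sketch).
    // f has two local minima; the global minimum is near x = -1.3.
    #include <cmath>
    #include <cstdio>
    #include <random>

    double f(double x) { return x * x * x * x - 3 * x * x + x; }

    int main() {
        std::mt19937 rng(123);
        std::uniform_real_distribution<double> unit(0.0, 1.0);  // for the acceptance test
        std::normal_distribution<double> step(0.0, 0.5);        // random "nearby" proposal

        double x = 3.0, fx = f(x);                      // arbitrary starting solution
        for (double T = 10.0; T > 1e-3; T *= 0.995) {   // geometric cooling schedule
            double y = x + step(rng);
            double fy = f(y);
            // Always accept improvements; accept uphill moves with probability
            // exp(-(fy - fx) / T), which shrinks toward zero as T decreases.
            if (fy < fx || unit(rng) < std::exp(-(fy - fx) / T)) {
                x = y;
                fx = fy;
            }
        }
        std::printf("approximate minimum: f(%.4f) = %.4f\n", x, fx);
    }

In practice the cooling rate and step size need tuning per problem: cooling too fast behaves like greedy descent, cooling too slowly wastes time.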

Citavi - Reference Management and Knowledge Organization - 0 views

  •  
    "Search databases and library catalogs directly from within Citavi. Save results to your project with a click. Surf and save: when you find a book, article, or webpage online, use the Picker to quickly add its information to Citavi. Save copies of webpages as PDFs. Find and save all available PDF full text articles in Citavi. Everything in one place and always at hand."

Safety Scanner - Windows Defender Security Intelligence - 0 views

  •  
    "Microsoft Safety Scanner is a scan tool designed to find and remove malware from Windows computers. Simply download it and run a scan to find malware and try to reverse changes made by identified threats."

Maximising Profits While Keeping Costs Low - 3 views

started by bar software on 06 Mar 12 no follow-up yet

Genetic programming - Wikipedia, the free encyclopedia - 0 views

  • In artificial intelligence, genetic programming (GP) is an evolutionary algorithm-based methodology inspired by biological evolution to find computer programs that perform a user-defined task. It is a specialization of genetic algorithms (GA) where each individual is a computer program. Therefore it is a machine learning technique used to optimize a population of computer programs according to a fitness landscape determined by a program's ability to perform a given computational task.
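
To make the idea concrete, here is a small illustrative genetic-programming sketch in C++: each individual is a program in the form of an expression tree over {+, -, *, x, constant}, and the population is evolved toward the target function x*x + x + 1 by tournament selection and subtree mutation (crossover is omitted for brevity). The target function, tree representation, and all parameters are assumptions made up for this example; none of them come from the article.

    #include <cmath>
    #include <cstdio>
    #include <memory>
    #include <random>
    #include <vector>

    static std::mt19937 rng(42);

    struct Node {
        char op;                       // '+', '-', '*', 'x', or 'c' (constant)
        double value = 0.0;            // used when op == 'c'
        std::shared_ptr<Node> l, r;    // children for binary operators
    };
    using Tree = std::shared_ptr<Node>;

    // Build a random expression tree; only leaves are allowed once depth runs out.
    Tree randomTree(int depth) {
        std::uniform_int_distribution<int> pick(0, 4);
        int k = (depth <= 0) ? pick(rng) % 2 + 3 : pick(rng);
        auto n = std::make_shared<Node>();
        if (k == 3) { n->op = 'x'; }
        else if (k == 4) { n->op = 'c'; n->value = std::uniform_real_distribution<>(-2, 2)(rng); }
        else { n->op = "+-*"[k]; n->l = randomTree(depth - 1); n->r = randomTree(depth - 1); }
        return n;
    }

    double eval(const Tree& t, double x) {
        switch (t->op) {
            case 'x': return x;
            case 'c': return t->value;
            case '+': return eval(t->l, x) + eval(t->r, x);
            case '-': return eval(t->l, x) - eval(t->r, x);
            default:  return eval(t->l, x) * eval(t->r, x);
        }
    }

    // Fitness: total absolute error against the target on sample points (lower is better).
    double fitness(const Tree& t) {
        double err = 0;
        for (double x = -1; x <= 1; x += 0.1) err += std::fabs(eval(t, x) - (x * x + x + 1));
        return err;
    }

    // Mutation: replace a randomly chosen subtree with a freshly generated one.
    Tree mutate(const Tree& t) {
        if (std::uniform_real_distribution<>(0, 1)(rng) < 0.3 || t->op == 'x' || t->op == 'c')
            return randomTree(2);
        auto n = std::make_shared<Node>(*t);
        if (std::uniform_int_distribution<int>(0, 1)(rng)) n->l = mutate(t->l); else n->r = mutate(t->r);
        return n;
    }

    int main() {
        std::vector<Tree> pop;
        for (int i = 0; i < 200; ++i) pop.push_back(randomTree(3));
        for (int gen = 0; gen < 50; ++gen) {
            // Tournament selection: the fitter of two random individuals survives, mutated.
            std::vector<Tree> next;
            std::uniform_int_distribution<int> pick(0, (int)pop.size() - 1);
            for (std::size_t i = 0; i < pop.size(); ++i) {
                Tree a = pop[pick(rng)], b = pop[pick(rng)];
                next.push_back(mutate(fitness(a) < fitness(b) ? a : b));
            }
            pop = next;
        }
        Tree best = pop[0];
        for (const auto& t : pop) if (fitness(t) < fitness(best)) best = t;
        std::printf("best error: %f\n", fitness(best));
    }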

Home | Open Data Portal - 0 views

  •  
    "The European Union Open Data Portal (EU ODP) gives you access to open data published by EU institutions and bodies. All the data you can find via this catalogue are free to use and reuse for commercial or non-commercial purposes."

Freedom to Tinker - Research and expert commentary on digital technologies in public life - 0 views

  •  
    "Freedom to Tinker is hosted by Princeton's Center for Information Technology Policy, a research center that studies digital technologies in public life. Here you'll find comment and analysis from the digital frontier, written by the Center's faculty, students, and friends."

discuss.ipfs.io - 0 views

  •  
    "These forums are the main place to ask questions, share information and find like-minded people who are using IPFS, libp2p, multiformats, orbit, orbit-db, IPLD, or any of the other libraries, tools and protocols created … read more"

Knil | The best way to search and find when you're mobile - 0 views

  •  
    "Knil structures the information of the web into actionable knowledge"

Lunyr - 0 views

  •  
    "Lunyr is an Ethereum-based decentralized crowdsourced encyclopedia which rewards users with app tokens for peer-reviewing and contributing information. We aim to be the starting point of the internet for finding reliable, accurate information. Our long-term vision is to develop a knowledge base API that developers can use to create next generation decentralized applications in Artificial Intelligence, Virtual Reality, Augmented Reality, and more."

EDRi - Defending rights and freedoms online - 0 views

  •  
    'European Digital Rights (EDRi) is an association of civil and human rights organisations from across Europe. We defend rights and freedoms in the digital environment. You can find our members here.'

searchcode | source code search engine - 0 views

  •  
    'Type in anything you want to find and you will be presented with the results that match, with the relevant lines highlighted. Searches can be filtered down using the filter panel. Some suggested search terms,'

IPFS Distributions - 0 views

  •  
    "This is the downloads website for all the official software distributions of the IPFS Project. You can find all the apps, binaries, and packages here. Every distribution has a section on this page with … the distribution name and a short description the current version number and release date"

Quantiki | Quantum Information Portal and Wiki - 1 views

  •  
    "The world's leading portal for everyone involved in quantum information science. No matter if you are a researcher, a student or an enthusiast of quantum theory, this is the place you are going to find useful and enjoyable! While here on Quantiki you can: browse our content, including fascinating and educative articles, then create your own account and log in to gain more editorial possibilities."

System Pro | Search Reinvented for Research™ - 0 views

  •  
    "Search reinvented for research™ Meet System Pro The fastest and most reliable way to find, synthesize, and contextualize scientific research - starting in health and life sciences."

Elicit: The AI Research Assistant - 0 views

  •  
    "Analyze research papers at superhuman speed Automate time-consuming research tasks like summarizing papers, extracting data, and synthesizing your findings."

Find the Best Ai Tools Directory | Feature AI Tools - 0 views

  •  
    "Feature Ai Tools Top AI Tools Directory Discover all Top AI Tools for free in one website. With daily updates, stay up to date with new era of technology and unlock the potential of AI in your fingertips."

FindMyAITool - List of AI Tools - 0 views

  •  
    "Discover AI Tools for Your Business! Streamline Your Workflow with Our List of AI tools. Find Your Perfect Solution."

Top-Notch Computer Tech Support Service - 1 views

started by shalani mujer on 10 Nov 11 no follow-up yet

Common Mistakes in Online and Real-time Contests - 0 views

  • Dynamic programming problems are to be solved with tabular methods
    • Ahmed Mansour
       
      Dynamic programming, like the divide-and-conquer method, solves problems by combining the solutions to subproblems. ("Programming" in this context refers to a tabular method, not to writing computer code.) In other words, we split the big problem into smaller subproblems, solve those, and then combine their solutions to get the answer to the big problem :D - see Introduction to Algorithms, chapter 15. (A small tabulation sketch appears after this entry.)
  • breadth-first search
    • Ahmed Mansour
       
      In graph theory, breadth-first search (BFS) is a graph search algorithm that begins at the root node and explores all the neighboring nodes. Then for each of those nearest nodes, it explores their unexplored neighbor nodes, and so on, until it finds the goal. In plain terms: if I have, say, a tree made up of several levels and I want to search for a particular node in it, I take the tree from its root (level 0) and walk it level by level, going down to level 1 and so on until I find the node I am looking for. See this tutorial, which shows how the BFS algorithm works: http://www.personal.kent.edu/~rmuhamma/Algorithms/MyAlgorithms/GraphAlgor/breadthSearch.htm (A short BFS sketch also appears after this entry.)
  • Memorize the value of pi. You should always try to remember the value of pi as far as possible, 3.1415926535897932384626433832795, certainly the first several digits. The judges may not give the value in the question, and if you use values like 22/7 or 3.1416 or 3.142857, then it is very likely that some of the critical judge inputs will cause you to get the wrong answer. You can also get the value of pi as a compiler-defined constant or from the following code: Pi=2*acos(0)
    • Islam TeCNo
       
      Haha... first time I hear about this topic, and first time I learn that Pi = 2*acos(0)
    • Abdelrahman Ogail
       
      Thanks Islam for the info, really useful
  • You cannot always check the equality of floating point numbers with the == operator in C/C++. Logically their values may be the same, but due to precision limits and rounding errors they may differ by some small amount and be incorrectly deemed unequal by your program. (See the tolerance-based comparison sketch after this entry.)
  • #define swap(xxx, yyy) (xxx) ^= (yyy) ^= (xxx) ^= (yyy)
    • Islam TeCNo
       
      I remember someone told me that it's impossible to do swapping using macros :D ... but I think it's possible (a safer swap macro is sketched after this entry)
  • But recursion should not be discounted completely, as some problems are very easy to solve recursively (DFS, backtracking)
    • Islam TeCNo
       
      Some problems are much easier to solve with recursion (see the DFS sketch after this entry)
  • Having a good understanding of probability is vital to being a good programmer
  •  
    For beginner ACMers; hoping this is useful!
  •  
    In this article the author discusses the common problems that face teams in ACM contests, and how to avoid them!
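
On the dynamic-programming annotation above: a bottom-up ("tabular") solution fills in answers to the smallest subproblems first and combines them. Here is a minimal C++ sketch using the rod-cutting problem from chapter 15 of Introduction to Algorithms; the price table is the book's example, the rest is my own illustration.

    // Bottom-up (tabular) dynamic programming: rod cutting.
    #include <algorithm>
    #include <cstdio>
    #include <vector>

    int main() {
        // price[i] = price of a rod piece of length i (price[0] is unused).
        std::vector<int> price = {0, 1, 5, 8, 9, 10, 17, 17, 20};
        int n = static_cast<int>(price.size()) - 1;

        // best[j] = maximum revenue obtainable from a rod of length j.
        std::vector<int> best(n + 1, 0);
        for (int j = 1; j <= n; ++j)        // solve subproblems smallest-first
            for (int i = 1; i <= j; ++i)    // first piece has length i; the rest is already tabulated
                best[j] = std::max(best[j], price[i] + best[j - i]);

        std::printf("max revenue for a rod of length %d: %d\n", n, best[n]);  // prints 22
    }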
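
On the breadth-first search annotation: a short C++ sketch of the level-by-level exploration it describes. The sample graph and its node numbering are invented for the example.

    // BFS on a small undirected graph: dist[v] ends up holding the level of v,
    // i.e. its distance in edges from the source.
    #include <cstdio>
    #include <queue>
    #include <vector>

    int main() {
        std::vector<std::vector<int>> adj = {
            {1, 2}, {0, 3, 4}, {0, 4}, {1, 5}, {1, 2, 5}, {3, 4}   // adjacency lists for nodes 0..5
        };
        int source = 0;
        std::vector<int> dist(adj.size(), -1);   // -1 means "not visited yet"
        std::queue<int> q;

        dist[source] = 0;
        q.push(source);
        while (!q.empty()) {
            int u = q.front();
            q.pop();
            for (int v : adj[u]) {
                if (dist[v] == -1) {             // first time v is reached: it is one level deeper
                    dist[v] = dist[u] + 1;
                    q.push(v);
                }
            }
        }
        for (std::size_t v = 0; v < dist.size(); ++v)
            std::printf("node %zu is at level %d\n", v, dist[v]);
    }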
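
On the floating-point equality point: the usual workaround is to compare with a small tolerance instead of ==. This is a sketch of that idea, not something the article mandates; the tolerance 1e-9 is an arbitrary assumption, so pick whatever precision the problem statement requires.

    // Tolerance-based comparison of floating-point values.
    #include <cmath>
    #include <cstdio>

    bool nearlyEqual(double a, double b, double eps = 1e-9) {
        return std::fabs(a - b) < eps;
    }

    int main() {
        double x = 0.1 + 0.2;   // not exactly 0.3 in binary floating point
        std::printf("== says %d, nearlyEqual says %d\n", (int)(x == 0.3), (int)nearlyEqual(x, 0.3));
    }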
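
On the swap macro highlighted above: the XOR trick does compile, but in C (and in C++ before C++17) the expression modifies the same object more than once without sequencing, which is undefined behavior, and it silently zeroes the value whenever both arguments name the same variable. A safer sketch, assuming C++11 or later, is a macro with a temporary, or simply std::swap.

    #include <algorithm>   // std::swap
    #include <cstdio>

    // Swap via a temporary; the do/while(0) wrapper keeps the macro statement-safe.
    #define SWAP_TMP(x, y) do { auto _tmp = (x); (x) = (y); (y) = _tmp; } while (0)

    int main() {
        int a = 3, b = 7;
        SWAP_TMP(a, b);            // a == 7, b == 3
        std::swap(a, b);           // back to a == 3, b == 7
        std::printf("%d %d\n", a, b);
    }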
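
On the recursion point: a tiny recursive depth-first search in C++ (the graph is invented for the example). The reason recursion keeps DFS and backtracking code short is that the call stack remembers where to resume after each branch is exhausted.

    #include <cstdio>
    #include <vector>

    const std::vector<std::vector<int>> adj = { {1, 2}, {0, 3}, {0, 3}, {1, 2, 4}, {3} };
    std::vector<bool> visited(5, false);   // one flag per node

    void dfs(int u) {
        visited[u] = true;
        std::printf("visiting %d\n", u);
        for (int v : adj[u])
            if (!visited[v]) dfs(v);       // returning from the call is the implicit backtrack
    }

    int main() { dfs(0); }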