
Computer Science Knowledge Sharing - Group items tagged "search"


Abdelrahman Ogail

Simulated annealing - Wikipedia, the free encyclopedia - 1 views

  •  
    Simulated annealing (SA) is a generic probabilistic metaheuristic for the global optimization problem of applied mathematics, namely locating a good approximation to the global minimum of a given function in a large search space. It is often used when the search space is discrete (e.g., all tours that visit a given set of cities). For certain problems, simulated annealing may be more effective than exhaustive enumeration - provided that the goal is merely to find an acceptably good solution in a fixed amount of time, rather than the best possible solution. The name and inspiration come from annealing in metallurgy, a technique involving heating and controlled cooling of a material to increase the size of its crystals and reduce their defects. The heat causes the atoms to become unstuck from their initial positions (a local minimum of the internal energy) and wander randomly through states of higher energy; the slow cooling gives them more chances of finding configurations with lower internal energy than the initial one. By analogy with this physical process, each step of the SA algorithm replaces the current solution by a random "nearby" solution, chosen with a probability that depends on the difference between the corresponding function values and on a global parameter T (called the temperature), that is gradually decreased during the process. The dependency is such that the current solution changes almost randomly when T is large, but increasingly "downhill" as T goes to zero. The allowance for "uphill" moves saves the method from becoming stuck at local minima - which are the bane of greedier methods. The method was independently described by S. Kirkpatrick, C. D. Gelatt and M. P. Vecchi in 1983 [1], and by V. Černý in 1985 [2]. The method is an adaptation of the Metropolis-Hastings algorithm, a Monte Carlo method to generate sample states of a thermodynamic system, invented by N. Metropolis et al. in 1953 [3].
  •  
    A natural AI approach
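  To make the excerpt above concrete, here is a minimal C++ sketch of the accept/reject loop it describes. This is not code from the bookmarked article: the toy cost function, the Gaussian neighbour step, and the geometric cooling schedule are illustrative assumptions.

    #include <cmath>
    #include <cstdio>
    #include <random>

    // Toy cost function standing in for the "energy" being minimized.
    double f(double x) { return x * x * x * x - 3 * x * x * x + 2; }

    int main() {
        std::mt19937 rng(42);
        std::uniform_real_distribution<double> unit(0.0, 1.0);
        std::normal_distribution<double> step(0.0, 0.5);

        double x = 0.0, fx = f(x);                    // current solution and its energy
        for (double T = 1.0; T > 1e-4; T *= 0.995) {  // temperature, cooled geometrically
            double y = x + step(rng);                 // random "nearby" candidate
            double fy = f(y);
            // Always accept downhill moves; accept uphill moves with
            // probability exp(-(fy - fx) / T) (the Metropolis rule).
            if (fy < fx || unit(rng) < std::exp(-(fy - fx) / T)) {
                x = y;
                fx = fy;
            }
        }
        std::printf("approximate minimum: f(%.3f) = %.3f\n", x, fx);
        return 0;
    }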
Janos Haits

searchcode | source code search engine - 0 views

  •  
    'Type in anything you want to find and you will be presented with the results that match, with the relevant lines highlighted. Searches can be filtered down using the filter panel. Some suggested search terms,'
Janos Haits

AI Academic Research Reading App: AI Literature Search Tool; 250 M+ Research Papers | R... - 0 views

  •  
    "Your #1 companion for AI literature search! Academic research reading made easy. Access 250M+ research papers from the most preferred AI literature search tool."
Janos Haits

Structured Data Linter - 0 views

  •  
    The Structured Data Linter is a tool that helps webmasters and web developers verify the structured data present in their HTML pages. Search engines use structured data to understand webpages more accurately and to present enhanced search results. The Linter understands the microdata, JSON-LD and RDFa formats according to their latest specifications. Note, however, that it does not guarantee that all consumers (e.g., search engines) will make use of all the structured data available in your page. The linter does not currently support microformats (contributions welcome).
Janos Haits

ipfs-search.com - 1 views

  •  
    "Search the Distributed Web"
Janos Haits

Consensus - Evidence-Based Answers, Faster - 0 views

  •  
    "AI Search Engine for Research. Consensus is a search engine that uses AI to find insights in research papers"
Abdelrahman Ogail

Genetic algorithm - Wikipedia, the free encyclopedia - 0 views

  • A genetic algorithm (GA) is a search technique used in computing to find exact or approximate solutions to optimization and search problems. Genetic algorithms are categorized as global search heuristics. Genetic algorithms are a particular class of evolutionary algorithms (EA) that use techniques inspired by evolutionary biology such as inheritance, mutation, selection, and crossover (also called recombination).
  • A typical genetic algorithm requires: a genetic representation of the solution domain, a fitness function to evaluate the solution domain.
  •  
    GAs are primarily used for learning in AI.
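  As a companion to the excerpt above, the following is a minimal C++ sketch of the two ingredients it names: a genetic representation (a fixed-length bit string) and a fitness function (the toy "OneMax" count of one-bits). The population size, mutation rate, tournament selection, and single-point crossover are illustrative assumptions, not details taken from the article.

    #include <algorithm>
    #include <cstdio>
    #include <random>
    #include <vector>

    using Genome = std::vector<int>;   // genetic representation: a bit string

    // Fitness function: the toy "OneMax" objective (count of 1-bits).
    int fitness(const Genome& g) { return (int)std::count(g.begin(), g.end(), 1); }

    int main() {
        const int kLen = 32, kPop = 50, kGens = 200;
        std::mt19937 rng(1);
        std::uniform_int_distribution<int> bit(0, 1), pos(0, kPop - 1), cut(1, kLen - 1);
        std::uniform_real_distribution<double> unit(0.0, 1.0);

        std::vector<Genome> pop(kPop, Genome(kLen));
        for (auto& g : pop)
            for (auto& b : g) b = bit(rng);            // random initial population

        for (int gen = 0; gen < kGens; ++gen) {
            std::vector<Genome> next;
            // Tournament selection: the fitter of two random individuals.
            auto pick = [&]() -> const Genome& {
                const Genome& a = pop[pos(rng)];
                const Genome& b = pop[pos(rng)];
                return fitness(a) >= fitness(b) ? a : b;
            };
            while ((int)next.size() < kPop) {
                Genome child = pick();
                const Genome& other = pick();
                int c = cut(rng);                      // single-point crossover
                std::copy(other.begin() + c, other.end(), child.begin() + c);
                for (auto& b : child)
                    if (unit(rng) < 0.01) b ^= 1;      // bit-flip mutation
                next.push_back(std::move(child));
            }
            pop = std::move(next);
        }

        int best = 0;
        for (const auto& g : pop) best = std::max(best, fitness(g));
        std::printf("best fitness after %d generations: %d / %d\n", kGens, best, kLen);
        return 0;
    }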
Janos Haits

Coral AI: Search & Summarize Documents with AI - 0 views

  •  
    "Search & Summarize Documents with AI Cut your reading time in half - upload a PDF to get answers, summaries, translations, and citations in seconds."
Janos Haits

Zakta Technology Platform - OEM Collaborative Search and Social Curation - 0 views

  •  
    Zakta is the world's leading Social Curation Platform. Zakta helps power the next generation of social applications with search, curation and collaboration technologies.
Abdelrahman Ogail

Common Mistakes in Online and Real-time Contests - 0 views

  • Dynamic programming problems are to be solved with tabular methods
    • Ahmed Mansour
       
      Dynamic programming, like the divide-and-conquer method, solves problems by combining the solutions to subproblems. ("Programming" in this context refers to a tabular method, not to writing computer code.) In other words, we split the big problem into a set of smaller problems, solve those, and then combine their solutions to get the solution to the big problem :D See Introduction to Algorithms, chapter 15. (A small tabular-DP sketch appears after this list.)
  • breadth-first search
    • Ahmed Mansour
       
      In graph theory, breadth-first search (BFS) is a graph search algorithm that begins at the root node and explores all the neighboring nodes. Then for each of those nearest nodes, it explores their unexplored neighbor nodes, and so on, until it finds the goal. In plain terms: if I have a tree made up of several levels, then to search for a particular node I take the tree from its root (level 0) and walk level by level, going down to level 1 and so on, until I find the node I am looking for. See this tutorial showing how the BFS algorithm works: http://www.personal.kent.edu/~rmuhamma/Algorithms/MyAlgorithms/GraphAlgor/breadthSearch.htm (A BFS sketch also appears after this list.)
  • Memorize the value of pi. You should always try to remember the value of pi to as many digits as possible, 3.1415926535897932384626433832795, at least the first several digits. The judges may not give the value in the question, and if you use values like 22/7 or 3.1416 or 3.142857, then it is very likely that some of the critical judge inputs will cause you to get the wrong answer. You can also get the value of pi as a compiler-defined constant or from the following code: Pi=2*acos(0)
    • Islam TeCNo
       
      Haha... this is the first time I've heard of this, and the first time I learn that Pi = 2*acos(0).
    • Abdelrahman Ogail
       
      Thanks Islam for the info, really useful
  • You cannot always check the equality of floating point numbers with the == operator in C/C++. Logically their values may be the same, but due to the precision limit and rounding errors they may differ by some small amount and be incorrectly deemed unequal by your program (see the epsilon-comparison sketch after this list).
  • #define swap(xxx, yyy) (xxx) ^= (yyy) ^= (xxx) ^= (yyy)
    • Islam TeCNo
       
      I remember someone told me that it's impossible to do swapping using macros :D ...but I think it is possible (see the note on this macro after the list).
  • But recursion should not be discounted completely, as some problems are very easy to solve recursively (DFS, backtracking)
    • Islam TeCNo
       
      Some problems are much easier when using recursion
  • Having a good understanding of probability is vital to being a good programmer
  •  
    For beginner ACMers; hoping it will be useful!
  •  
    In this article the author discusses the common problems that ACM contest teams face and how to avoid them.
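  The tabular-DP sketch referenced above: a hedged, minimal C++ example of solving a problem bottom-up by filling a table of subproblem solutions. The 0/1 knapsack instance (weights, values, capacity) is made-up sample data, not taken from the article.

    #include <algorithm>
    #include <cstdio>
    #include <vector>

    int main() {
        const int capacity = 10;
        const std::vector<int> weight = {3, 4, 5};
        const std::vector<int> value  = {40, 50, 70};

        // dp[c] = best value achievable with capacity c using the items seen so far.
        std::vector<int> dp(capacity + 1, 0);
        for (std::size_t i = 0; i < weight.size(); ++i)
            for (int c = capacity; c >= weight[i]; --c)   // go downwards so each item is used at most once
                dp[c] = std::max(dp[c], dp[c - weight[i]] + value[i]);

        std::printf("best value for capacity %d: %d\n", capacity, dp[capacity]);  // 120 for this data
        return 0;
    }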
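  The BFS sketch referenced above: a minimal C++ version of the level-by-level search described in the annotation. The small adjacency list is a made-up example graph.

    #include <cstdio>
    #include <queue>
    #include <vector>

    int main() {
        // undirected edges: 0-1, 0-2, 1-3, 2-3, 3-4
        std::vector<std::vector<int>> adj = {{1, 2}, {0, 3}, {0, 3}, {1, 2, 4}, {3}};
        const int source = 0;

        std::vector<int> dist(adj.size(), -1);   // -1 marks "not yet visited"
        std::queue<int> q;
        dist[source] = 0;
        q.push(source);
        while (!q.empty()) {
            int u = q.front();
            q.pop();
            for (int v : adj[u])
                if (dist[v] == -1) {             // first time we reach v: record its level
                    dist[v] = dist[u] + 1;
                    q.push(v);
                }
        }
        for (std::size_t v = 0; v < adj.size(); ++v)
            std::printf("node %zu is at level %d\n", v, dist[v]);
        return 0;
    }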
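  The epsilon-comparison sketch referenced above: instead of ==, compare floating-point values within a small tolerance. The 1e-9 tolerance is an illustrative choice; many judges state the tolerance to use.

    #include <cmath>
    #include <cstdio>

    // Treat two doubles as equal if they differ by less than a small tolerance.
    bool nearly_equal(double a, double b, double eps = 1e-9) {
        return std::fabs(a - b) < eps;
    }

    int main() {
        double x = 0.1 + 0.2;                                          // not exactly 0.3 in binary floating point
        std::printf("x == 0.3     -> %d\n", (int)(x == 0.3));          // typically 0 (false)
        std::printf("nearly_equal -> %d\n", (int)nearly_equal(x, 0.3)); // 1 (true)
        return 0;
    }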
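  A note on the XOR-swap macro quoted in this entry: it does swap two distinct integer variables, but it zeroes the value if both arguments name the same object, and in C the chained ^= has unsequenced side effects. The hedged sketch below illustrates both points; the evaluation order of the chain is only guaranteed from C++17 on, and a temporary variable (or std::swap) is the safer choice.

    #include <cstdio>
    #include <utility>

    #define XOR_SWAP(x, y) ((x) ^= (y) ^= (x) ^= (y))    // the quoted one-liner

    int main() {
        int a = 3, b = 5;
        XOR_SWAP(a, b);                                  // works for two distinct variables (C++17 ordering)
        std::printf("after XOR_SWAP: a=%d b=%d\n", a, b);    // a=5 b=3

        int c = 7;
        XOR_SWAP(c, c);                                  // self-swap pitfall: c ^ c == 0
        std::printf("after XOR_SWAP(c, c): c=%d\n", c);  // c=0, not 7

        std::swap(a, b);                                 // the safe, idiomatic alternative
        std::printf("after std::swap: a=%d b=%d\n", a, b);
        return 0;
    }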
Janos Haits

Citavi - Reference Management and Knowledge Organization - 0 views

  •  
    "Search databases and library catalogs directly from within Citavi. Save results to your project with a click. Surf and save: when you find a book, article, or webpage online, use the Picker to quickly add its information to Citavi. Save copies of webpages as PDFs. Find and save all available PDF full text articles in Citavi. Everything in one place and always at hand."
Janos Haits

Zotero | Home - 0 views

  •  
    A personal research assistant. Zotero is the only research tool that automatically senses content, allowing you to add it to your personal library with a single click. Whether you're searching for a preprint on arXiv.org, a journal article from JSTOR, a news story from the New York Times, or a book from your university library catalog, Zotero has you covered with support for thousands of sites.
Janos Haits

Semantic Scholar - 0 views

  •  
    "Semantic Scholar is a free, nonprofit, academic search engine from AI2."
Janos Haits

OAS - 0 views

  •  
    "Advance scientific research. Promote technology. For the good of all humanity. Open Academic Search (OAS) is a working group aiming to advance scientific research and discovery, promote technology that assists the scientific and academic communities, and make research available worldwide for the good of all humanity."
Janos Haits

Zendy | AI-Powered Research Library - 0 views

  •  
    "Empowering knowledge with every search Explore open access and paywalled academic literature across all disciplines Trusted by over 675.3K readers Access over 40.7M research publications Download over 1.1M e-Books Find res"
Janos Haits

Echoes - ChatGPT & Claude Conversation Search and Management - 0 views

  •  
    "Take Control of Your AI Conversations Across ChatGPT, Claude, and Gemini! Search, organize, and summarize your AI interactions seamlessly-all in one powerful tool designed to save you time and boost your productivity."
Islam TeCNo

Design and Analysis of Computer Algorithms - 0 views

  • Dijkstra's Algorithm
    • Islam TeCNo
       
      Algorithm for finding the shortest path in a graph (a minimal sketch follows after this entry).
  • Huffman's Codes
    • Islam TeCNo
       
      used in data compression
  •  
    recommended !! :D
    Mathematics for Algorithmic
      * Sets
      * Functions and Relations
      * Vectors and Matrices
      * Linear Inequalities and Linear Equations
    Greedy Algorithms
      * Knapsack Problem
        o 0-1 Knapsack
        o Fractional Knapsack
      * Activity Selection Problem
      * Huffman's Codes
      * Minimum Spanning Tree
      * Kruskal's Algorithm
      * Prim's Algorithm
      * Dijkstra's Algorithm
    Divide & Conquer Algorithms
    Dynamic Programming Algorithms
      * Knapsack Problem DP Solution
      * Activity Selection Problem DP Solution
    Amortized Analysis
      * Aggregate Method
      * Accounting Method
      * Potential Method
      * Dynamic Table
    Hash Table
    Binary Search Tree
    Graph Algorithms
      * Breadth First Search (BFS)
      * Depth First Search (DFS)
      * Topological Sort
      * Strongly Connected Components
      * Euler Tour
      * Generic Minimum Spanning Tree
      * Kruskal's Algorithm
      * Prim's Algorithm
      * Single Source Shortest Path
        o Dijkstra's Algorithm
        o Bellman-Ford Algorithm
    String Matching
      * Naïve String Matching
      * Knuth-Morris-Pratt Algorithm
      * Boyer-Moore Algorithm
    Sorting
      * Bubble Sort
      * Insertion Sort
      * Selection Sort
      * Shell Sort
      * Heap Sort
      * Merge Sort
      * Quick Sort
    Linear-Time Sorting
      * Counting Sort
      * Radix Sort
      * Bucket Sort
    Computational Geometry
    Computational Complexity
      * Information-Theoretic Argument
      * Adversary Argument
      * NP-Completeness And Reduction
    Approximate Algorithms
      * Vertex Cover
      * The Traveling Salesman Problem
    Linear Programming
    Appendix
      1. Parabola
      2. Tangent Codes
    References
    Hoping to discuss these algorithms with each other!
  •  
    This web page contains many algorithms explained in a simple way. I think these may be useful tutorials. Hoping to discuss these algorithms with each other!
  •  
    Ohhhhhhh... this is really awesome, Mans... thanks!
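  The Dijkstra sketch promised above: a minimal C++ implementation of single-source shortest paths on a graph with non-negative edge weights, using a min-priority queue. The small weighted graph is made-up sample data; the linked page's own presentation may differ.

    #include <cstdio>
    #include <functional>
    #include <queue>
    #include <utility>
    #include <vector>

    int main() {
        // adjacency list: adj[u] = list of (neighbour, edge weight)
        std::vector<std::vector<std::pair<int, int>>> adj = {
            {{1, 4}, {2, 1}},   // edges out of node 0
            {{3, 1}},           // edges out of node 1
            {{1, 2}, {3, 5}},   // edges out of node 2
            {}                  // node 3 has no outgoing edges
        };
        const int source = 0;
        const long long INF = 1LL << 60;

        std::vector<long long> dist(adj.size(), INF);
        using State = std::pair<long long, int>;   // (distance so far, node)
        std::priority_queue<State, std::vector<State>, std::greater<State>> pq;
        dist[source] = 0;
        pq.push({0, source});

        while (!pq.empty()) {
            auto [d, u] = pq.top();
            pq.pop();
            if (d > dist[u]) continue;             // stale queue entry, skip it
            for (auto [v, w] : adj[u])
                if (dist[u] + w < dist[v]) {       // relax edge u -> v
                    dist[v] = dist[u] + w;
                    pq.push({dist[v], v});
                }
        }

        for (std::size_t v = 0; v < adj.size(); ++v)
            std::printf("shortest distance %d -> %zu: %lld\n", source, v, dist[v]);
        return 0;
    }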
Abdelrahman Ogail

Stochastic optimization - Wikipedia, the free encyclopedia - 0 views

  • Stochastic optimization (SO) methods are optimization algorithms which incorporate probabilistic (random) elements, either in the problem data (the objective function, the constraints, etc.), or in the algorithm itself (through random parameter values, random choices, etc.), or in both [1]. The concept contrasts with the deterministic optimization methods, where the values of the objective function are assumed to be exact, and the computation is completely determined by the values sampled so far.
  •  
    In Artificial Intelligence, Genetic Algorithms belong to the class of stochastic search explained above.
Janos Haits

Knil | The best way to search and find when you're mobile - 0 views

  •  
    "Knil structures the information of the web into actionable knowledge"