Computer Science Knowledge Sharing: Group items tagged Algorithms

Islam TeCNo

Design and Analysis of Computer Algorithms - 0 views

  • Dijkstra's Algorithm
    • Islam TeCNo
       
      Algorithm for finding the shortest path in a graph.
  • Huffman's Codes
    • Islam TeCNo
       
      used in data compression
  •  
    Recommended! :D The page's outline:
    - Mathematics for Algorithmics: Sets, Functions and Relations; Vectors and Matrices; Linear Inequalities and Linear Equations
    - Greedy Algorithms: Knapsack Problem (0-1 Knapsack, Fractional Knapsack), Activity Selection Problem, Huffman's Codes, Minimum Spanning Tree, Kruskal's Algorithm, Prim's Algorithm, Dijkstra's Algorithm
    - Divide & Conquer Algorithms
    - Dynamic Programming Algorithms: Knapsack Problem DP Solution, Activity Selection Problem DP Solution
    - Amortized Analysis: Aggregate Method, Accounting Method, Potential Method, Dynamic Table
    - Hash Table
    - Binary Search Tree
    - Graph Algorithms: Breadth First Search (BFS), Depth First Search (DFS), Topological Sort, Strongly Connected Components, Euler Tour, Generic Minimum Spanning Tree, Kruskal's Algorithm, Prim's Algorithm, Single Source Shortest Path (Dijkstra's Algorithm, Bellman-Ford Algorithm)
    - String Matching: Naïve String Matching, Knuth-Morris-Pratt Algorithm, Boyer-Moore Algorithm
    - Sorting: Bubble Sort, Insertion Sort, Selection Sort, Shell Sort, Heap Sort, Merge Sort, Quick Sort
    - Linear-Time Sorting: Counting Sort, Radix Sort, Bucket Sort
    - Computational Geometry
    - Computational Complexity: Information-Theoretic Argument, Adversary Argument, NP-Completeness and Reduction
    - Approximate Algorithms: Vertex Cover, The Traveling Salesman Problem
    - Linear Programming
    - Appendix: Parabola; Tangent Codes
    - References
    Hoping to discuss these algorithms with each other!
  •  
    This web page contains a lot of algorithms discussed in simple ways! I think these may be useful tutorials! Hoping to discuss these algorithms with each other!
  •  
    Ohhh... really awesome, Mans... thanks!
Abdelrahman Ogail

Simulated annealing - Wikipedia, the free encyclopedia - 1 views

  • Simulated annealing (SA) is a generic probabilistic metaheuristic for the global optimization problem of applied mathematics, namely locating a good approximation to the global minimum of a given function in a large search space. It is often used when the search space is discrete (e.g., all tours that visit a given set of cities). For certain problems, simulated annealing may be more effective than exhaustive enumeration — provided that the goal is merely to find an acceptably good solution in a fixed amount of time, rather than the best possible solution. The name and inspiration come from annealing in metallurgy, a technique involving heating and controlled cooling of a material to increase the size of its crystals and reduce their defects. The heat causes the atoms to become unstuck from their initial positions (a local minimum of the internal energy) and wander randomly through states of higher energy; the slow cooling gives them more chances of finding configurations with lower internal energy than the initial one. By analogy with this physical process, each step of the SA algorithm replaces the current solution by a random "nearby" solution, chosen with a probability that depends on the difference between the corresponding function values and on a global parameter T (called the temperature), that is gradually decreased during the process. The dependency is such that the current solution changes almost randomly when T is large, but increasingly "downhill" as T goes to zero. The allowance for "uphill" moves saves the method from becoming stuck at local minima—which are the bane of greedier methods. The method was independently described by S. Kirkpatrick, C. D. Gelatt and M. P. Vecchi in 1983 [1], and by V. Černý in 1985 [2]. The method is an adaptation of the Metropolis-Hastings algorithm, a Monte Carlo method to generate sample states of a thermodynamic system, invented by N. Metropolis et al. in 1953 [3].
  •  
    A natural AI approach
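
As a concrete companion to the description above, here is a minimal simulated-annealing sketch in C++. The 1-D test function, the geometric cooling schedule, and all constants are illustrative assumptions, not from the article; only the accept/reject rule follows the text.

    #include <cmath>
    #include <cstdio>
    #include <random>

    // A multimodal test function with many local minima (hypothetical example).
    double f(double x) { return x * x + 10.0 * std::sin(3.0 * x); }

    int main() {
        std::mt19937 rng(42);
        std::uniform_real_distribution<double> unit(0.0, 1.0);
        std::normal_distribution<double> step(0.0, 1.0);

        double x = 8.0, fx = f(x);                    // start far from the global minimum
        for (double T = 10.0; T > 1e-3; T *= 0.999) { // T slowly decreases (geometric cooling)
            double y = x + step(rng);                 // random "nearby" candidate solution
            double fy = f(y);
            // Metropolis rule: always accept downhill moves; accept uphill moves
            // with probability exp(-(fy - fx) / T), which vanishes as T -> 0.
            if (fy < fx || unit(rng) < std::exp(-(fy - fx) / T)) {
                x = y;
                fx = fy;
            }
        }
        std::printf("x = %.4f, f(x) = %.4f\n", x, fx);
    }

The occasional uphill acceptances are exactly what let the search escape the local dips of f before the falling temperature freezes it into one basin.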
Abdelrahman Ogail

Genetic algorithm - Wikipedia, the free encyclopedia - 0 views

  • A genetic algorithm (GA) is a search technique used in computing to find exact or approximate solutions to optimization and search problems. Genetic algorithms are categorized as global search heuristics. Genetic algorithms are a particular class of evolutionary algorithms (EA) that use techniques inspired by evolutionary biology such as inheritance, mutation, selection, and crossover (also called recombination).
  • A typical genetic algorithm requires: a genetic representation of the solution domain, a fitness function to evaluate the solution domain.
  •  
    GAs are primarily used for learning in AI.
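
To make the quoted definition concrete, here is a minimal GA sketch in C++ on the classic OneMax toy problem (maximize the number of 1-bits). The genome length, population size, and rates are illustrative assumptions; the inheritance, mutation, selection, and crossover steps are the ones the quote names.

    #include <algorithm>
    #include <cstdio>
    #include <random>
    #include <vector>

    using Genome = std::vector<int>;
    std::mt19937 rng(1);

    int fitness(const Genome& g) {                    // OneMax: count of 1-bits
        int s = 0;
        for (int b : g) s += b;
        return s;
    }

    Genome tournament(const std::vector<Genome>& pop) {   // selection: best of 3 random picks
        std::uniform_int_distribution<size_t> pick(0, pop.size() - 1);
        Genome best = pop[pick(rng)];
        for (int i = 0; i < 2; ++i) {
            const Genome& c = pop[pick(rng)];
            if (fitness(c) > fitness(best)) best = c;
        }
        return best;
    }

    int main() {
        const int LEN = 40, POP = 30, GENS = 100;
        std::uniform_int_distribution<int> bit(0, 1);
        std::uniform_real_distribution<double> coin(0.0, 1.0);

        std::vector<Genome> pop(POP, Genome(LEN));    // random initial population
        for (auto& g : pop) for (auto& b : g) b = bit(rng);

        for (int gen = 0; gen < GENS; ++gen) {
            std::vector<Genome> next;
            std::uniform_int_distribution<int> cut(1, LEN - 1);
            while ((int)next.size() < POP) {
                Genome a = tournament(pop), b = tournament(pop);
                int c = cut(rng);                     // one-point crossover (recombination)
                Genome child(a.begin(), a.begin() + c);
                child.insert(child.end(), b.begin() + c, b.end());
                for (auto& x : child)                 // mutation: flip each bit with prob 1/LEN
                    if (coin(rng) < 1.0 / LEN) x ^= 1;
                next.push_back(child);
            }
            pop = std::move(next);
        }
        int best = 0;
        for (auto& g : pop) best = std::max(best, fitness(g));
        std::printf("best fitness: %d / %d\n", best, LEN);
    }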
Abdelrahman Ogail

Genetic programming - Wikipedia, the free encyclopedia - 0 views

  • In artificial intelligence, genetic programming (GP) is an evolutionary algorithm-based methodology inspired by biological evolution to find computer programs that perform a user-defined task. It is a specialization of genetic algorithms (GA) where each individual is a computer program. Therefore it is a machine learning technique used to optimize a population of computer programs according to a fitness landscape determined by a program's ability to perform a given computational task.
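
Where the GA sketch above evolves fixed-length genomes, in GP each individual is a program. Below is a minimal, mutation-only sketch in C++ that evolves arithmetic expression trees toward the hypothetical target x^2 + x + 1 (symbolic regression); real GP would also use subtree crossover, omitted here for brevity.

    #include <algorithm>
    #include <cstdlib>
    #include <iostream>
    #include <memory>
    #include <vector>

    // Expression-tree node: op is '+', '-', '*', 'x' (variable), or 'c' (constant).
    struct Node {
        char op = 'c';
        double val = 0;
        std::unique_ptr<Node> l, r;
    };

    std::unique_ptr<Node> randomTree(int depth) {
        auto n = std::make_unique<Node>();
        if (depth == 0 || std::rand() % 3 == 0) {     // terminal: variable or constant
            if (std::rand() % 2) n->op = 'x';
            else { n->op = 'c'; n->val = std::rand() % 10; }
        } else {                                      // internal: random operator
            n->op = "+-*"[std::rand() % 3];
            n->l = randomTree(depth - 1);
            n->r = randomTree(depth - 1);
        }
        return n;
    }

    double eval(const Node& n, double x) {
        switch (n.op) {
            case 'x': return x;
            case 'c': return n.val;
            case '+': return eval(*n.l, x) + eval(*n.r, x);
            case '-': return eval(*n.l, x) - eval(*n.r, x);
            default:  return eval(*n.l, x) * eval(*n.r, x);
        }
    }

    double error(const Node& n) {                     // fitness: squared error vs target
        double e = 0;
        for (double x = -5; x <= 5; x += 1) {
            double d = eval(n, x) - (x * x + x + 1);  // hypothetical target program
            e += d * d;
        }
        return e;
    }

    std::unique_ptr<Node> clone(const Node& n) {
        auto m = std::make_unique<Node>();
        m->op = n.op; m->val = n.val;
        if (n.l) m->l = clone(*n.l);
        if (n.r) m->r = clone(*n.r);
        return m;
    }

    // Collect every slot where a subtree could be replaced (including the root).
    void slots(std::unique_ptr<Node>& s, std::vector<std::unique_ptr<Node>*>& out) {
        out.push_back(&s);
        if (s->l) slots(s->l, out);
        if (s->r) slots(s->r, out);
    }

    int main() {
        std::srand(42);
        const int POP = 60, GENS = 300;
        std::vector<std::unique_ptr<Node>> pop;
        for (int i = 0; i < POP; ++i) pop.push_back(randomTree(3));
        for (int g = 0; g < GENS; ++g) {
            // Rank by error; keep the better half, refill with mutated clones.
            std::sort(pop.begin(), pop.end(),
                      [](const auto& a, const auto& b) { return error(*a) < error(*b); });
            for (int i = POP / 2; i < POP; ++i) {
                pop[i] = clone(*pop[i - POP / 2]);
                std::vector<std::unique_ptr<Node>*> where;
                slots(pop[i], where);
                *where[std::rand() % where.size()] = randomTree(2);   // subtree mutation
            }
        }
        std::cout << "best error: " << error(*pop[0]) << "\n";
    }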
Abdelrahman Ogail

Stochastic optimization - Wikipedia, the free encyclopedia - 0 views

  • Stochastic optimization (SO) methods are optimization algorithms which incorporate probabilistic (random) elements, either in the problem data (the objective function, the constraints, etc.), or in the algorithm itself (through random parameter values, random choices, etc.), or in both [1]. The concept contrasts with the deterministic optimization methods, where the values of the objective function are assumed to be exact, and the computation is completely determined by the values sampled so far.
  •  
    In artificial intelligence, genetic algorithms belong to the class of stochastic search described above.
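
The simplest member of this class is pure random search, where the randomness sits entirely in the algorithm rather than in the problem data. A minimal sketch (the quadratic test function and the box bounds are made-up stand-ins):

    #include <cstdio>
    #include <random>

    // Hypothetical objective with minimum 0 at (1, -2).
    double f(double x, double y) { return (x - 1) * (x - 1) + (y + 2) * (y + 2); }

    int main() {
        std::mt19937 rng(7);
        std::uniform_real_distribution<double> box(-10.0, 10.0);
        double bx = 0, by = 0, bf = f(bx, by);
        for (int i = 0; i < 100000; ++i) {    // sample the box at random, keep the best
            double x = box(rng), y = box(rng), v = f(x, y);
            if (v < bf) { bf = v; bx = x; by = y; }
        }
        std::printf("min ~ f(%.3f, %.3f) = %.6f\n", bx, by, bf);
    }

A deterministic method would exploit the gradient here; the stochastic version just samples and keeps the best, which still applies when no gradient is available.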
Abdelrahman Ogail

Hill climbing - Wikipedia, the free encyclopedia - 0 views

  • In computer science, hill climbing is a mathematical optimization technique which belongs to the family of local search. It is relatively simple to implement, making it a popular first choice. Although more advanced algorithms may give better results, in some situations hill climbing works just as well. Hill climbing can be used to solve problems that have many solutions, some of which are better than others. It starts with a random (potentially poor) solution, and iteratively makes small changes to the solution, each time improving it a little. When the algorithm cannot see any improvement anymore, it terminates. Ideally, at that point the current solution is close to optimal, but it is not guaranteed that hill climbing will ever come close to the optimal solution. For example, hill climbing can be applied to the traveling salesman problem. It is easy to find a solution that visits all the cities but will be very poor compared to the optimal solution. The algorithm starts with such a solution and makes small improvements to it, such as switching the order in which two cities are visited. Eventually, a much better route is obtained. Hill climbing is used widely in artificial intelligence, for reaching a goal state from a starting node. Choice of next node and starting node can be varied to give a list of related algorithms.
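
A minimal hill-climbing sketch in C++, mirroring the traveling-salesman example in the text: start from a random tour and keep a swap of two cities only when it shortens the tour. The city count and coordinates are illustrative assumptions.

    #include <algorithm>
    #include <cmath>
    #include <cstdio>
    #include <numeric>
    #include <random>
    #include <vector>

    int main() {
        const int N = 30;
        std::mt19937 rng(3);
        std::uniform_real_distribution<double> coord(0.0, 100.0);
        std::vector<double> X(N), Y(N);
        for (int i = 0; i < N; ++i) { X[i] = coord(rng); Y[i] = coord(rng); }

        auto tourLen = [&](const std::vector<int>& t) {
            double L = 0;
            for (int i = 0; i < N; ++i) {
                int a = t[i], b = t[(i + 1) % N];
                L += std::hypot(X[a] - X[b], Y[a] - Y[b]);
            }
            return L;
        };

        std::vector<int> tour(N);
        std::iota(tour.begin(), tour.end(), 0);       // start from an arbitrary tour
        std::shuffle(tour.begin(), tour.end(), rng);
        double best = tourLen(tour);

        std::uniform_int_distribution<int> city(0, N - 1);
        for (int step = 0; step < 200000; ++step) {   // small change: swap two cities
            int i = city(rng), j = city(rng);
            std::swap(tour[i], tour[j]);
            double L = tourLen(tour);
            if (L < best) best = L;
            else std::swap(tour[i], tour[j]);         // undo: hill climbing never goes uphill
        }
        std::printf("tour length after climbing: %.1f\n", best);
    }

Because no worsening move is ever accepted, the climb can stall in a local optimum; that is precisely the weakness simulated annealing (above) addresses.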
Abdelrahman Ogail

Common Mistakes in Online and Real-time Contests - 0 views

  • Dynamic programming problems are to be solved with tabular methods
    • Ahmed Mansour
       
      Dynamic programming, like the divide-and-conquer method, solves problems by combining the solutions to subproblems. ("Programming" in this context refers to a tabular method, not to writing computer code.) In other words, we split the big problem into a bunch of smaller problems, solve those, and then combine them; that combination is the solution to the big problem :D See the Introduction to Algorithms book, chapter 15, and the rod-cutting sketch at the end of this list.
  • breadth-first search
    • Ahmed Mansour
       
      In graph theory, breadth-first search (BFS) is a graph search algorithm that begins at the root node and explores all the neighboring nodes. Then for each of those nearest nodes, it explores their unexplored neighbor nodes, and so on, until it finds the goal. In plain terms: if I have a tree, say, and that tree consists of several levels, then when I search for a particular node I take the tree from its root, which is level 0, and walk level by level, going down to level 1 and so on, until I find my node. See this tutorial showing how the BFS algorithm works: http://www.personal.kent.edu/~rmuhamma/Algorithms/MyAlgorithms/GraphAlgor/breadthSearch.htm (a minimal BFS sketch also appears at the end of this list).
  • Memorize the value of pi You should always try to remember the value of pi as far as possible, 3.1415926535897932384626433832795, certainly the part in italics. The judges may not give the value in the question, and if you use values like 22/7 or 3.1416 or 3.142857, then it is very likely that some of the critical judge inputs will cause you to get the wrong answer. You can also get the value of pi as a compiler-defined constant or from the following code: Pi=2*acos(0)
    • Islam TeCNo
       
      Hahaha... first time I hear about this topic, and first time I learn that Pi = 2*acos(0)!
    • Abdelrahman Ogail
       
      Thanks Islam for the info, really useful
  • ...4 more annotations...
  • You cannot always check the equality of floating-point numbers with the == operator in C/C++. Logically their values may be the same, but due to precision limits and rounding errors they may differ by some small amount and be incorrectly deemed unequal by your program. (See the tolerance-comparison sketch at the end of this list.)
  • #define swap(xxx, yyy) (xxx) ^= (yyy) ^= (xxx) ^= (yyy)
    • Islam TeCNo
       
      I remember someone told me that it's impossible to do swapping using macros :D ...but I think it is possible (see the note on this macro at the end of this list).
  • But recursion should not be discounted completely, as some problems are very easy to solve recursively (DFS, backtracking)
    • Islam TeCNo
       
      Some problems are much easier when using recursion. (A recursive DFS sketch appears at the end of this list.)
  • Having a good understanding of probability is vital to being a good programmer
  •  
    For beginner ACMers; hoping it will be useful!
  •  
    In this article the author discusses the common problems that face teams in ACM contests, and how to avoid them!
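
For the tabular-method annotation above: a bottom-up dynamic programming sketch of rod cutting, the opening example of the cited chapter 15 of Introduction to Algorithms (the price table follows the book's sample data).

    #include <algorithm>
    #include <cstdio>
    #include <vector>

    int main() {
        std::vector<int> price = {0, 1, 5, 8, 9, 10, 17, 17, 20};  // price[i] = value of a rod of length i
        int n = (int)price.size() - 1;
        std::vector<int> best(n + 1, 0);       // best[j] = max revenue for a rod of length j
        for (int j = 1; j <= n; ++j)           // fill the table from small subproblems up
            for (int i = 1; i <= j; ++i)       // first cut of length i, plus the tabled rest
                best[j] = std::max(best[j], price[i] + best[j - i]);
        std::printf("max revenue for length %d: %d\n", n, best[n]);
    }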
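
For the breadth-first-search annotation above: a minimal BFS sketch over an adjacency list, recording each node's level exactly as the level-by-level walk describes. The small sample graph is made up for illustration.

    #include <cstdio>
    #include <queue>
    #include <vector>

    int main() {
        std::vector<std::vector<int>> adj = {
            {1, 2},     // 0 -> 1, 2
            {0, 3},     // 1 -> 0, 3
            {0, 3},     // 2 -> 0, 3
            {1, 2, 4},  // 3 -> 1, 2, 4
            {3},        // 4 -> 3
        };
        std::vector<int> dist(adj.size(), -1); // -1 marks unvisited; dist doubles as the level
        std::queue<int> q;
        dist[0] = 0;                           // start from the "root" node 0
        q.push(0);
        while (!q.empty()) {
            int u = q.front(); q.pop();
            for (int v : adj[u])
                if (dist[v] == -1) {           // first visit: one level deeper than u
                    dist[v] = dist[u] + 1;
                    q.push(v);
                }
        }
        for (size_t v = 0; v < adj.size(); ++v)
            std::printf("node %zu at level %d\n", v, dist[v]);
    }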
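
For the floating-point annotation above: the usual contest workaround is to compare within a small tolerance instead of using == (the epsilon value here is a typical choice, not from the article).

    #include <cmath>
    #include <cstdio>

    // Treat values within eps of each other as equal.
    bool nearlyEqual(double a, double b, double eps = 1e-9) {
        return std::fabs(a - b) < eps;
    }

    int main() {
        double x = 0.1 + 0.2;                  // not exactly 0.3 in binary floating point
        std::printf("x == 0.3    -> %d\n", x == 0.3);            // typically 0
        std::printf("nearlyEqual -> %d\n", nearlyEqual(x, 0.3)); // 1
    }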
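
A note on the swap macro quoted above: written as one expression, it modifies xxx twice with no sequence points in between, which is undefined behavior in C and in C++ before C++17; and any XOR swap zeroes the value if both arguments name the same object. A comma-sequenced variant, sketched below, is at least well defined for distinct integer lvalues.

    #include <cstdio>

    // Comma operator sequences the three steps; still breaks if a and b alias.
    #define SWAP_XOR(a, b) ((a) ^= (b), (b) ^= (a), (a) ^= (b))

    int main() {
        int x = 3, y = 5;
        SWAP_XOR(x, y);
        std::printf("x = %d, y = %d\n", x, y); // x = 5, y = 3
    }

In practice std::swap (or a plain temporary) is both safer and no slower.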
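
For the recursion annotation above: a recursive depth-first search over the same style of adjacency list as the BFS sketch, where the call stack implicitly stores the path being explored.

    #include <cstdio>
    #include <vector>

    void dfs(int u, const std::vector<std::vector<int>>& adj, std::vector<bool>& seen) {
        seen[u] = true;
        std::printf("visit %d\n", u);
        for (int v : adj[u])
            if (!seen[v]) dfs(v, adj, seen);   // recursion keeps the "path so far" on the call stack
    }

    int main() {
        std::vector<std::vector<int>> adj = {{1, 2}, {0, 3}, {0, 3}, {1, 2, 4}, {3}};
        std::vector<bool> seen(adj.size(), false);
        dfs(0, adj, seen);
    }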
Janos Haits

Wolfram Cloud - 0 views

  •  
    "Integrated Access to Computational Intelligence The Wolfram Cloud combines a state-of-the-art notebook interface with the world's most productive programming language-scalable for programs from tiny to huge, with immediate access to a vast depth of built-in algorithms and knowledge. Learn more »"
Janos Haits

Quantum - Google Research - 0 views

  •  
    "A research effort from Google AI that aims to build quantum processors and develop novel quantum algorithms to dramatically accelerate computational tasks for machine learning."
Janos Haits

TensorFlow Quantum - 1 views

  •  
    "TensorFlow Quantum is a library for hybrid quantum-classical machine learning. TensorFlow Quantum (TFQ) is a quantum machine learning library for rapid prototyping of hybrid quantum-classical ML models. Research in quantum algorithms and applications can leverage Google's quantum computing frameworks, all from within TensorFlow."
Janos Haits

Wolfram Language for Knowledge-Based Programming - 0 views

  •  
    "Designed for the new generation of programmers, the Wolfram Language has a vast depth of built-in algorithms and knowledge, all automatically accessible through its elegant unified symbolic language. Scalable for programs from tiny to huge, with immediate deployment locally and in the cloud, the Wolfram Language builds on clear principles-and three decades of development-to create what promises to be the world's most productive programming language."
Janos Haits

Quantum Computing Playground - 0 views

  •  
    "Quantum Computing Playground is a browser-based WebGL Chrome Experiment. It features a GPU-accelerated quantum computer with a simple IDE interface, and its own scripting language with debugging and 3D quantum state visualization features. Quantum Computing Playground can efficiently simulate quantum registers up to 22 qubits, run Grover's and Shor's algorithms, and has a variety of quantum gates built into the scripting language itself."
Ahmed Mansour

Introduction to Design Patterns - 0 views

  • A design pattern is a widely accepted solution to a recurring design problem in OOP. A design pattern:
    - describes how to structure classes to meet a given requirement
    - provides a general blueprint to follow when implementing part of a program
    - does not describe how to structure the entire application
    - does not describe specific algorithms
    - focuses on relationships between classes
  • Design patterns make you more productive and help you write cleaner code. Observer and Singleton are just two of the many available. If you like design patterns, try these resources: the GoF book, Design Patterns: Elements of Reusable Object-Oriented Software; for design pattern examples in Java, see the Design Patterns in Java Reference and Example Site. (A minimal Singleton sketch follows this entry.)
  • learn what a design pattern is
    • Ahmed Mansour
       
      Link to download the Design Patterns: Elements of Reusable Object-Oriented Software book: http://rs638.rapidshare.com/files/242614498/Design_Patterns_Elements_Of_Reusable_Object_Oriented_Software.pdf
  •  
    In summary :D we can say that a design pattern is a general, reusable solution to a commonly occurring problem in software design. It describes the structure of, and the relations between, the classes and objects that solve a certain problem, without specifying the final application. Here is a book that TeCNo gave me: http://www.4shared.com/file/111350944/8be77835/Dummies_-_DesignPattern.html Hope it will be useful!
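
The article names Observer and Singleton but shows no code; as a hedged illustration, here is the Meyers-Singleton idiom in C++ (the Logger class is hypothetical, and the article's own examples are in Java).

    #include <cstdio>

    class Logger {
    public:
        static Logger& instance() {           // the one global access point
            static Logger single;             // constructed once, on first use
            return single;
        }
        void log(const char* msg) { std::printf("[log] %s\n", msg); }
    private:
        Logger() = default;                   // private ctor: no outside construction
        Logger(const Logger&) = delete;       // no copies
        Logger& operator=(const Logger&) = delete;
    };

    int main() {
        Logger::instance().log("same object everywhere");
        std::printf("same instance? %d\n",
                    &Logger::instance() == &Logger::instance());  // 1
    }

The function-local static gives exactly one lazily constructed instance, and since C++11 its initialization is thread-safe.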
Abdelrahman Ogail

CodeProject: C# vs C/C++ Performance. Free source code and programming help - 0 views

  • C# is compiled twice. Once when the program is written and again when the program is executed at the user's site. The first compilation is done by your C# builder and the second by the .NET Framework on the user's machine. The reason why C# compiled applications could be faster is that, during the second compilation, the compiler knows the actual run-time environment and processor type and can generate instructions that target a specific processor.
  • A well designed C# program is more than 90% as fast as an equivalent "well-designed" C++ program
  • The problem with "not-freeing" the memory at the right time is that the working set of the application increases which increases the number of "page faults"
  • That's a nice question. Except for writing time-critical blocks of code, prefer C#. Write all your algorithmic code in C++ (not VC++ .NET), compile it into a DLL, and call it through a DLL interop from C#. This should balance the performance. This technique is not new, nor invented by me or anyone in particular; it's like the old C-versus-assembly debate, where one camp argued that assembly is faster and the other that C is easier to develop, and people settled on embedding asm blocks in C programs for the time-critical parts. (A sketch of the native side appears below.)
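
A sketch of the native half of the DLL-interop recipe described above: a C++ function exported with extern "C" so that C# could bind it via P/Invoke. The function name, signature, and library name are hypothetical.

    #include <cstdint>

    #if defined(_WIN32)
    #  define EXPORT extern "C" __declspec(dllexport)
    #else
    #  define EXPORT extern "C"
    #endif

    // Time-critical inner loop kept in native code: sum of squares over a
    // caller-owned buffer (illustrative stand-in for "algorithmic code").
    EXPORT int64_t sum_of_squares(const int32_t* data, int32_t n) {
        int64_t s = 0;
        for (int32_t i = 0; i < n; ++i)
            s += static_cast<int64_t>(data[i]) * data[i];
        return s;
    }

On the C# side the binding would look roughly like [DllImport("native")] static extern long sum_of_squares(int[] data, int n); with the array marshaled as a pointer; the exact declaration depends on the project setup.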