Google App Engine Task Queues, Push vs. Pull Paradigm, and Web Hooks | Sachin Rekhi - 0 views

  • Task Queue is defined as a mechanism to asynchronously distribute a sequence of tasks among parallel threads of execution. The overall problem is broken down into tasks, which are enqueued onto the queue. Parallel threads of execution pull tasks from the queue and perform computations on them. The runtime system is responsible for managing thread access to the tasks in the queue, as well as for ensuring proper queue usage (e.g. dequeuing from an empty queue should not be allowed).
  • GAE Task Queue provides a push model. Instead of having an arbitrary number of worker processes constantly polling for available tasks, GAE Task Queue pushes work to workers when tasks are available. The work is then processed by the existing auto-scaling GAE infrastructure, so you don't have to worry about scaling workers up and down. You simply define the maximum rate of processing and GAE takes care of farming the tasks out to workers appropriately.
  • What is also compelling about GAE Task Queue is its use of web hooks as the description of a task unit. When you break it down, an individual task consists of the code to execute for that task as well as the individual data input for that specific task.
  • As for the data input, GET query-string parameters or the HTTP POST body provide a suitable mechanism for supplying any kind of input. In this way, a task description is simply a URL that handles the request plus a set of input parameters to that request.
  • A given task has a 30-second deadline. That means no individual task can perform more than 30 seconds of computation, including getting data from the datastore, calling third-party APIs, computing aggregations, etc. (A short sketch of the enqueue-and-handle flow follows below.)
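
A minimal sketch of this push/web-hook flow, using the Python taskqueue and webapp APIs; the handler path and parameter name are invented for illustration, and older SDKs expose taskqueue under google.appengine.api.labs:

    from google.appengine.api import taskqueue
    from google.appengine.ext import webapp

    # Enqueue a unit of work: the URL names the code to run, the params carry
    # this particular task's input data.
    def schedule_resize(photo_id):
        taskqueue.add(url='/tasks/resize', params={'photo_id': photo_id})

    # The web hook App Engine pushes the task to. Everything here must finish
    # within the task deadline (30 seconds at the time this was written).
    class ResizeTaskHandler(webapp.RequestHandler):
        def post(self):
            photo_id = self.request.get('photo_id')
            # ... fetch from the datastore, call APIs, aggregate, store results ...
            # Returning normally sends HTTP 200 and marks the task as done.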

Has anyone got actually got tag files to work? - Google App Engine for Java | Google Gr... - 0 views

  • My tag files worked very well on the dev server without any problem. When uploading to GAE/J, I encountered the problem described in the above-mentioned issue, and I was able to fix it by following the workaround guidelines provided in that issue. My environment: WinXP SP2, Eclipse 3.4, GAE/J SDK 1.3.1, JDK 5.0.

Idempotence & multiple task execution - Google App Engine | Google Groups - 0 views

  • The decision to rerun a task is made based on the HTTP response code. There is always a response code, even when the connection is lost. When the code is 200 the task is considered complete and will not be rerun. Any other code means the task needs a rerun. The time between reruns is increased with each retry, so a given task is never retried in parallel; but a task created later may finish first because it did not need to retry. (See the idempotent-handler sketch below.)
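
A hedged sketch of an idempotent task handler that drives retries through its HTTP status; the Invoice model and the invoice_id parameter are made up for illustration:

    from google.appengine.ext import db, webapp

    class Invoice(db.Model):                  # hypothetical model
        sent = db.BooleanProperty(default=False)

    class SendInvoiceHandler(webapp.RequestHandler):
        def post(self):
            invoice = Invoice.get_by_key_name(self.request.get('invoice_id'))
            if invoice is None or invoice.sent:
                return                        # work already done: reruns are harmless
            # ... send the email / call the third-party API here ...
            invoice.sent = True
            invoice.put()
            # A normal return sends HTTP 200, so the task is considered complete.
            # Any other status (e.g. an uncaught exception -> 500) schedules a
            # rerun, with the delay growing on each retry.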

Best method to store large dictionary into db? - google-appengine-python | Google Groups - 0 views

  • If you are trying to connect to a local server, you will need to provide the -s <server> argument; otherwise it will try to connect to <appid>.appspot.com.

Alternatives to exploding indexes ... - google-appengine-python | Google Groups - 0 views

  • You could create an entity group for this: a Post entity (with data, category and sort properties) and a UserPost child entity (with a user property).
  • If you can form the key_names for Post and UserPost like this: Key('Post', '<data>') and Key('Post', '<data>', 'UserPost', '<user>'), then you can perform a keys-only query for UserPost and, using the keys' parents, perform a get to retrieve the relevant Posts. Adding new users incrementally to a Post is very simple and lightweight. Deleting a Post would also require you to delete its UserPost children. Being in an entity group allows you to use transactions, though generally you wouldn't need them, except perhaps when performing a full delete. (A sketch of this layout follows below.)
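
A rough sketch of that layout with the Python db API; property types and the sample values are assumptions:

    from google.appengine.ext import db

    class Post(db.Model):                # key_name = <data>
        data = db.StringProperty()
        category = db.StringProperty()
        sort = db.IntegerProperty()

    class UserPost(db.Model):            # child of its Post; key_name = <user>
        user = db.StringProperty()

    # Key('Post', '<data>') and Key('Post', '<data>', 'UserPost', '<user>')
    post_key = db.Key.from_path('Post', 'some-data')
    userpost_key = db.Key.from_path('Post', 'some-data', 'UserPost', 'alice')

    # Keys-only query for a user's UserPost entities, then a get on the keys'
    # parents to retrieve the relevant Post entities.
    userpost_keys = UserPost.all(keys_only=True).filter('user =', 'alice').fetch(100)
    posts = db.get([k.parent() for k in userpost_keys])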

Queries and Indexes - Google App Engine - Google Code - 0 views

  • Queries involving keys use indexes just like queries involving properties. Queries on keys require custom indexes in the same cases as with properties, with a couple of exceptions: inequality filters or an ascending sort order on __key__ do not require a custom index, but a descending sort order on __key__ does. As with all queries, the development web server creates appropriate configuration entries in index.yaml when a query that needs a custom index is tested. (Both cases are illustrated below.)
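
A brief illustration with the Python db API; the Item kind is invented for the example:

    from google.appengine.ext import db

    class Item(db.Model):
        pass

    start = db.Key.from_path('Item', 'aaa')

    # Inequality filter and ascending order on __key__: served by the built-in index.
    q1 = Item.all().filter('__key__ >', start).order('__key__')

    # Descending order on __key__: needs a custom index, which the development
    # web server will add to index.yaml the first time the query is run.
    q2 = Item.all().order('-__key__')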

mindash-datastore - Project Hosting on Google Code - 0 views

  • This project is a framework for storing entities larger than 1MB using the Google App Engine low-level datastore API.
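
mindash-datastore itself is a Java framework, so the following is only a hedged, Python-flavoured sketch of the general idea it implements (splitting one logical value across several child entities, each under the roughly 1MB entity limit); the kind and property names are invented and not part of the project:

    from google.appengine.ext import db

    CHUNK_SIZE = 1000000 - 1024              # stay safely under the ~1MB entity limit

    class BlobChunk(db.Model):               # illustrative only, not mindash-datastore's API
        index = db.IntegerProperty()
        data = db.BlobProperty()

    def store_large_value(name, value):
        """Split a large byte string across several child entities."""
        parent = db.Key.from_path('LargeValue', name)
        chunks = [value[i:i + CHUNK_SIZE] for i in range(0, len(value), CHUNK_SIZE)]
        # (a real implementation would batch these puts to stay under API call limits)
        db.put([BlobChunk(parent=parent, key_name=str(n), index=n, data=db.Blob(c))
                for n, c in enumerate(chunks)])

    def load_large_value(name):
        parent = db.Key.from_path('LargeValue', name)
        chunks = BlobChunk.all().ancestor(parent).order('index').fetch(1000)
        return ''.join(c.data for c in chunks)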

synchronous task queue support - Google App Engine for Java | Google Groups - 0 views

  • How about having task 1 queue task 2, and task 2 queue task 3? (A sketch of this chaining follows below.)
  • The use case is that I have 3 tasks that I want to insert into the queue. Each task takes about 20 seconds to complete. The challenge is that task #3 depends on task #2, and task #2 depends on task #1. The question is: how can I insert those 3 tasks and expect task 1 to be finished before task 2 starts, and task 2 to be finished before task 3 starts?
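
A hedged sketch of that suggestion using the Python APIs; the handler paths are invented, and each handler is assumed to be mapped to its URL in the application:

    from google.appengine.api import taskqueue
    from google.appengine.ext import webapp

    class Step1Handler(webapp.RequestHandler):
        def post(self):
            # ... roughly 20 seconds of work for step 1 ...
            taskqueue.add(url='/tasks/step2')   # enqueue step 2 only once step 1 is done

    class Step2Handler(webapp.RequestHandler):
        def post(self):
            # ... work for step 2 ...
            taskqueue.add(url='/tasks/step3')   # and step 3 only once step 2 is done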

"Manual" UI testing with GWT and App Engine - Google App Engine for Java | Google Groups - 0 views

  • I've been able to accomplish what you're doing with Selenium testing. If you're using GWT, then your integration testing and user acceptance testing probably won't be that far from each other.
  • My question is this: what's the best way to use LocalServiceTestHelper so that I can use my app like it has a persistent store? There are no 'setUp' and 'tearDown' hooks like in a JUnit test, and the app runs in a separate process within the IDE. (And in any case, I'm looking to do integration testing where I want the state to be consistent across a number of page requests.)

Arachnid's bulkupdate at master - GitHub - 0 views

  • The App Engine bulk update library is a library for the App Engine Python runtime, designed to facilitate bulk processing of datastore records. It provides a framework for easily performing operations on all the entities matched by an arbitrary datastore query, as well as an administrative interface to monitor and control batch update jobs.

Task Queue task chaining done right - Nick's Blog - 0 views

  • One common pattern when using the Task Queue API is known as 'task chaining'. You execute a task on the task queue, and at some point, determine that you're going to need another task, either to complete the work the current task is doing, or to start doing something new.
  • The solution is straightforward: use task names. Task names ensure that the chained task is only enqueued once, even if the task doing the chaining gets re-executed repeatedly. There are many ways to derive a task name;
  • If you want to see this in practice, the bulkupdate library makes use of this pattern, constructing task names from the datastore ID of the bulkupdate job (to ensure uniqueness) and the number of the chained task (to ensure consistency).
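
A hedged sketch of naming the chained task so that re-executions of the chaining task cannot enqueue it twice; the name format, URL and parameters are invented for illustration:

    from google.appengine.api import taskqueue

    def chain_next(job_id, step):
        """Enqueue the next task in the chain exactly once."""
        # Derive the name from the job's ID (uniqueness) and the step number (consistency).
        name = 'job-%s-step-%d' % (job_id, step + 1)
        try:
            taskqueue.add(name=name, url='/tasks/process',
                          params={'job_id': job_id, 'step': step + 1})
        except (taskqueue.TaskAlreadyExistsError, taskqueue.TombstonedTaskError):
            # An earlier attempt already enqueued (or already ran) the chained task.
            pass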

data structure to implement "some of your friends also like this" feature. - Google App... - 0 views

  • A query like this will use the merge join strategy by default. If it's too slow for the merge join strategy, you would have to build a custom index for it, which would indeed be 'exploding': each User entity would have len(friends)*len(likes) index entries. (A sketch of the query follows below.)
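
A sketch of what such a query might look like with the Python db API, assuming a User model with list properties for friends and likes (the model itself is an assumption, not from the thread):

    from google.appengine.ext import db

    class User(db.Model):
        friends = db.StringListProperty()    # ids of this user's friends
        likes = db.StringListProperty()      # ids of items this user likes

    # "Some of your friends also like this": two equality filters on list
    # properties, which the datastore can answer with a merge join over the
    # built-in single-property indexes (no custom index required).
    q = User.all(keys_only=True)
    q.filter('friends =', 'current-user-id')
    q.filter('likes =', 'item-id')
    friends_who_like_it = q.fetch(5)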

test unit doesn't work more - Google App Engine for Java | Google Groups - 0 views

  • If you're following the how-to article on unit testing (http://code.google.com/appengine/docs/java/howto/unittesting.html), you'll need to update your TestEnvironment class to look like this one: http://code.google.com/p/datanucleus-appengine/source/browse/branches... (lines 34 - 66). We're working on getting the docs updated right now.

Looking for Bulk Loader beta testers - Google App Engine | Google Groups - 0 views

  • For Java developers, there is a remote API handler available; adding the following to your web.xml file should work (disclaimer: I have not yet personally tested this):

    <servlet>
      <servlet-name>remoteapi</servlet-name>
      <servlet-class>com.google.apphosting.utils.remoteapi.RemoteApiServlet</servlet-class>
    </servlet>
    <servlet-mapping>
      <servlet-name>remoteapi</servlet-name>
      <url-pattern>/remote_api</url-pattern>
    </servlet-mapping>
    <security-constraint>
      <web-resource-collection>
        <web-resource-name>remoteapi</web-resource-name>
        <url-pattern>/remote_api</url-pattern>
      </web-resource-collection>
      <auth-constraint>
        <role-name>admin</role-name>
      </auth-constraint>
    </security-constraint>

A good data model for finding a user's favorite stories - Stack Overflow - 0 views

  • You can optimise it further, however: for each favorite, create a 'UserFavorite' entity as a child entity of the relevant Story entity (or, equivalently, as a child entity of a UserInfo entity), with the key name set to the user's unique ID. This way, you can determine whether a user has favorited a story with a simple get, UserFavorite.get_by_key_name(user_id, parent=a_story), and get operations are 3 to 5 times faster than queries, so this is a substantial improvement. (A sketch follows below.)
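
A short sketch of that scheme in the Python db API; the model properties and helper names are assumptions:

    from google.appengine.ext import db

    class Story(db.Model):
        title = db.StringProperty()

    class UserFavorite(db.Model):            # child of its Story; key_name = user's unique ID
        pass

    def mark_favorite(user_id, story):
        UserFavorite(parent=story, key_name=user_id).put()

    def is_favorite(user_id, story):
        # A get by key is several times faster than running a query.
        return UserFavorite.get_by_key_name(user_id, parent=story) is not None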

Uploading and Downloading Data - Google App Engine - Google Code - 0 views

  • import_transform: a single-argument function that returns the correctly typed value based on the external_name or import_template strings. Examples include the built-in Python conversion operators (such as float), any of several helper functions provided in transform (such as get_date_time or generate_foreign_key), a function provided in your own library, or an in-line lambda function. Alternatively, it can be a two-argument function taking the keyword argument bulkload_state, which contains useful information about the entity: bulkload_state.current_entity, the current entity being processed; bulkload_state.current_dictionary, the current export dictionary; and bulkload_state.filename, the --filename argument that was passed to appcfg.py.
  • import_template Specifies multiple dictionary items for a single property, using Python string interpolation.
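
As an illustration, hedged sketches of the two function shapes that could be referenced from a bulkloader.yaml property's import_transform; the function names and the fields they touch are invented:

    # One-argument form: receives the external value, returns the typed value.
    def dollars_to_float(value):
        return float(value.lstrip('$'))

    # Two-argument form: also accepts bulkload_state, giving access to the
    # current entity, the whole input row, and the source --filename.
    def tag_with_source(value, bulkload_state=None):
        return '%s (from %s)' % (value, bulkload_state.filename)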

Uploading and Downloading Data - Google App Engine - Google Code - 0 views

  • The data loader feature communicates with your application running on App Engine using remote_api, a request handler included with the App Engine runtime environment that allows remote applications with the proper credentials to access the datastore remotely. To use remote_api, you must map a URL to it.

bulkloader.py Authentication - Google App Engine | Google Groups - 0 views

  • I am trying to dump the data created in a Java App Engine application. So I have two versions of the app: the live Java version and the Python version that hosts /remote_api (this caused the URL confusion).
  • Perhaps you can try specifying app_id explicitly by adding "--app-id='yourappid'".

Advanced Bulk Loading Part 5: Bulk Loading for Java - Nick's Blog - 0 views

  • This article is outdated. There is a native Java remote_api servlet now.