
Google AppEngine / Group items tagged: datastore


Esfand S

mindash-datastore - Project Hosting on Google Code - 0 views

  • This project is a framework for storing entities larger than 1MB using the Google App Engine low-level datastore API.
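A minimal sketch of the general technique (this is not mindash-datastore's actual API; the kind names, property name, and chunk size are assumptions): split the payload across chunk entities grouped under one parent key, so no single entity exceeds the limit, and reassemble the bytes on read.

    import com.google.appengine.api.datastore.*;
    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.List;

    public class LargeValueStore {
        // Chunk size is an assumption; it only needs to keep each chunk entity
        // comfortably below the ~1MB entity size limit.
        private static final int CHUNK_SIZE = 1000 * 1000;
        private final DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();

        /** Stores data as a header entity plus numbered chunk entities parented under it. */
        public Key put(String name, byte[] data) {
            Entity header = new Entity("LargeValue", name);
            int chunkCount = (data.length + CHUNK_SIZE - 1) / CHUNK_SIZE;
            header.setProperty("chunkCount", chunkCount);
            Key parent = datastore.put(header);

            List<Entity> chunks = new ArrayList<Entity>();
            for (int i = 0; i < chunkCount; i++) {
                int from = i * CHUNK_SIZE;
                int to = Math.min(from + CHUNK_SIZE, data.length);
                Entity chunk = new Entity(KeyFactory.createKey(parent, "LargeValueChunk", i + 1));
                chunk.setProperty("bytes", new Blob(Arrays.copyOfRange(data, from, to)));
                chunks.add(chunk);
            }
            datastore.put(chunks);   // reading back means fetching the chunks and concatenating
            return parent;
        }
    }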
Esfand S

Parallel Asynchronous Datastore Commands with Twig 1.0 - Google App Engine for Java | G... - 1 views

  • Twig is an alternative to the standard persistence interfaces JDO and JPA, specifically designed to make the most of the underlying datastore's unique features.
  • Twig is the only interface to support direct unowned relationships, so your code is not dependent on low-level datastore classes. It is the only interface to support OR queries, which it implements by merging together multiple queries, sorting them, and filtering out duplicates at the lowest level for performance (a rough illustration of the idea follows these notes).
  • Async datastore calls are not yet part of the low-level API. Twig uses the underlying Protocol Buffer layer classes to call ApiProxy.makeAsyncCall() instead of makeSyncCall(). All the code is open source, so you can check out how it's done.
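Twig's own classes are not quoted in the thread, so as a rough illustration of the OR-by-merging idea in the second note above, here is what a hand-rolled version looks like against the low-level API (the "Person" kind and "name" property are made up; Twig does the merging, sorting, and de-duplication at a lower level than this):

    import com.google.appengine.api.datastore.*;
    import java.util.ArrayList;
    import java.util.Arrays;
    import java.util.LinkedHashMap;
    import java.util.List;
    import java.util.Map;

    public class OrQueryExample {
        /** Emulates "name == a OR name == b" by running one query per value and merging. */
        public static List<Entity> findByEitherName(String a, String b) {
            DatastoreService ds = DatastoreServiceFactory.getDatastoreService();

            Query first = new Query("Person");
            first.addFilter("name", Query.FilterOperator.EQUAL, a);
            Query second = new Query("Person");
            second.addFilter("name", Query.FilterOperator.EQUAL, b);

            Map<Key, Entity> merged = new LinkedHashMap<Key, Entity>();
            for (Query q : Arrays.asList(first, second)) {
                for (Entity e : ds.prepare(q).asIterable()) {
                    merged.put(e.getKey(), e);   // duplicates collapse on the key
                }
            }
            return new ArrayList<Entity>(merged.values());
        }
    }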
Esfand S

Delete all from datastore - Google App Engine for Java | Google Groups - 0 views

  • All you can do is fetch the entities of the type you want from the datastore and delete them yourself (call datastore.delete(Iterable<Key>) to remove them in batches). This way you can delete about 1000 entities per request... There is no way to delete all entities of a Kind with a single API call. (A keys-only sketch of this approach follows these notes.)
  • > If there is a way to list current Kinds or select all entities of any kind I could manage this.
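A sketch of the approach described above using the low-level API with a keys-only query; the kind name and the batch size are placeholders rather than anything the thread prescribes.

    import com.google.appengine.api.datastore.*;
    import java.util.ArrayList;
    import java.util.List;

    public class DeleteAllOfKind {
        /** Fetches only the keys of the given kind and deletes them in batches. */
        public static void deleteAll(String kind) {
            DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
            Query q = new Query(kind).setKeysOnly();   // keys are enough for delete()
            List<Key> batch = new ArrayList<Key>();
            for (Entity e : ds.prepare(q).asIterable()) {
                batch.add(e.getKey());
                if (batch.size() == 500) {             // flush in manageable batches
                    ds.delete(batch);
                    batch.clear();
                }
            }
            if (!batch.isEmpty()) {
                ds.delete(batch);
            }
        }
    }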
Esfand S

Difference between Kind and Entity in GAE datastore? - Stack Overflow - 0 views

  • An Entity is an individual record that gets stored and retrieved from the datastore. The Kind is the unique string identifier of the type of entity. For example, "Joe" is an Entity with age=42, dob=10-12-2000, and Kind "Person".
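Expressed with the low-level Java API (the property names follow the quoted example), the distinction looks like this:

    import com.google.appengine.api.datastore.DatastoreService;
    import com.google.appengine.api.datastore.DatastoreServiceFactory;
    import com.google.appengine.api.datastore.Entity;

    public class KindVsEntityExample {
        public static void storeJoe() {
            // The Kind is just the string "Person"; the Entity is this one stored record.
            DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
            Entity joe = new Entity("Person", "Joe");   // kind "Person", key name "Joe"
            joe.setProperty("age", 42);
            joe.setProperty("dob", "10-12-2000");
            ds.put(joe);                                // stored under the key Person("Joe")
        }
    }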
Esfand S

GAE load data into datastore without using CSV - Stack Overflow - 0 views

  • Basically, all you have to do is create a subclass of bulkloader.Loader and implement (at a minimum) the generate_records method, which should yield lists of strings. This same strategy would work for loading data from XML files or ROT13-encrypted files or whatever. Note that the list of strings yielded by the generate_records method must match up (in length and order) with the "properties" list you provide when you initialize the loader (i.e., the second argument to the AlbumLoader.__init__ method in this example). This approach actually provides a lot of flexibility: we're overriding the __init__ method on our JSONLoader implementation and automatically determining the kind of model we're loading and its list of properties to provide to the bulkloader.Loader parent class.
Esfand S

Creating and storing unique values - Google App Engine for Java | Google Groups - 0 views

  • You might want to instead use the datastore's built-in ability to efficiently create unique long ids and then, when sending them to a client, "muddle" them using a reversible hash function so they don't appear predictable. You will find that using long ids results in shorter Keys, which saves space in the datastore. You can only guarantee a unique value in the datastore by creating an Entity with the value in its key. This does not need to be your "main" entity, just a special UniqueName type. Then you can…
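A rough sketch of the key-based uniqueness technique described above: the value becomes the key name of a dedicated UniqueName entity, and a transaction makes the check-then-claim atomic. The kind name and the exception used here are this sketch's assumptions.

    import com.google.appengine.api.datastore.*;

    public class UniqueValues {
        /** Claims the value by creating a UniqueName entity keyed on it, or fails if it is taken. */
        public static void claim(String value) {
            DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
            Key key = KeyFactory.createKey("UniqueName", value);
            Transaction txn = ds.beginTransaction();
            try {
                ds.get(txn, key);                 // found: someone already owns this value
                txn.rollback();
                throw new IllegalStateException("Value already in use: " + value);
            } catch (EntityNotFoundException notTaken) {
                ds.put(txn, new Entity(key));     // not found: claim it atomically
                txn.commit();
            } finally {
                if (txn.isActive()) {
                    txn.rollback();
                }
            }
        }
    }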
Esfand S

Retrieving an entity from GAE datastore by key - Stack Overflow - 0 views

  • db.get('ag1iYXRjaC1nZW5lcmljchcLEgxCYXRjaGVzTW9kZWwiBUpvYiAyDA')
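The highlighted snippet is the Python API; for the Java entries in this list, the low-level equivalent would presumably decode the web-safe key string and fetch by Key, roughly like this:

    import com.google.appengine.api.datastore.*;

    public class GetByEncodedKey {
        /** Decodes a web-safe key string and fetches the entity it names. */
        public static Entity fetch(String webSafeKey) throws EntityNotFoundException {
            DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
            Key key = KeyFactory.stringToKey(webSafeKey);
            return ds.get(key);
        }
    }

For example, fetch("ag1iYXRjaC1nZW5lcmljchcLEgxCYXRjaGVzTW9kZWwiBUpvYiAyDA") would retrieve the same entity as the db.get() call above.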
Esfand S

Using the App Engine Mapper for bulk data import « Ikai Lan says - 0 views

  • The most obvious use case is data import. A developer looking to import large amounts of data would take the following steps: (1) create a CSV file containing the data you want to import (the assumption here is that each line of data corresponds to a datastore entity you want to create); (2) upload the CSV file to the blobstore (you’ll need billing to be enabled for this to work); (3) create your Mapper, push it live, and run your job to import your data. This isn’t meant to be a replacement for the bulk uploader tool, merely an alternative. This method requires considerably more programmatic work for custom data transforms. The advantage of this method is that the work is done on the server side, whereas the bulk uploader makes use of the remote API to get work done. (A sketch of the per-line entity creation follows these notes.)
  • to build Mappers that map across some large, contiguous piece of data as opposed to Entities in the datastore
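The Mapper framework's own classes are not quoted here, so the sketch below covers only the per-line transform a map step would perform in the import described above; the "Album" kind and the column layout are hypothetical.

    import com.google.appengine.api.datastore.Entity;

    public class CsvLineToEntity {
        /** Turns one CSV line (title,artist,year) into the datastore entity to be created. */
        public static Entity toEntity(String csvLine) {
            String[] cols = csvLine.split(",", -1);
            Entity album = new Entity("Album");
            album.setProperty("title", cols[0].trim());
            album.setProperty("artist", cols[1].trim());
            album.setProperty("year", Long.parseLong(cols[2].trim()));
            return album;
        }
    }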
Esfand S

SQLite Datastore in 1.3.3 doesn't seem to be working - google-appengine-python | Google... - 0 views

  • You can tell if it's using sqlite by executing 'sqlite3 /path/to/datastore' on the command line, and seeing if sqlite recognizes it as a valid DB. The code is essentially the same as that which I announced on my blog, so there shouldn't be significant performance differences.
Esfand S

GAE/J datastore backup - Stack Overflow - 0 views

  • Just set up remote_api for your app using the directions here, notably the tip: "If you have a Java app, you can use the Python bulkloader.py tool by installing the Java version of the remote_api handler, which is included with the Java runtime environment. The handler servlet class is com.google.apphosting.utils.remoteapi.RemoteApiServlet." Then use the Python bulkloader with --dump or --restore.
Esfand S

owned one to many relationship problem - Google App Engine for Java | Google Groups - 0 views

  • Because A is the entity group parent of B, the key of an instance of B must contain information about its entity group parent instance of A. Instead of using b.key.getId() in "B newB = pm.getObjectById(B.class, b.key.getId());" (the lookup commented "We cannot retrieve this object. Why?"), you might care to use the KeyFactory.Builder class, as detailed in http://code.google.com/intl/en/appengine/docs/java/datastore/creatinggettinganddeletingdata.html#Creating_and_Using_Keys, in order to build a key instance which contains information about both your B instance and its parental A instance.
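A sketch of the KeyFactory.Builder usage the reply points to, with placeholder ids standing in for the thread's a and b instances:

    import com.google.appengine.api.datastore.Key;
    import com.google.appengine.api.datastore.KeyFactory;

    public class ChildKeyLookup {
        /** Builds B's key including its entity group parent A, so the lookup can succeed. */
        public static Key childKey(long parentAId, long childBId) {
            return new KeyFactory.Builder("A", parentAId)   // parent A first
                    .addChild("B", childBId)                // then child B inside A's group
                    .getKey();
        }
        // With the thread's JDO classes, the failing lookup would then become:
        //   B newB = pm.getObjectById(B.class, childKey(aId, bId));
    }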
Esfand S

Using the Java Mapper Framework for App Engine « Ikai Lan says - 0 views

  • to write large batch processing jobs without having to think about the plumbing details.
  • it is a very easy way to perform some operation on every single Entity of a given Kind in your datastore in parallel
  • What would you have to build for yourself if Mapper weren’t available? Begin querying over every Entity in chained Task Queues; store beginning and end cursors (introduced in 1.3.5); create tasks to work with chunks of your datastore; write the code to manipulate your data; build an interface to control your batch jobs; and build a callback system for your multitudes of parallelized workers to call when the entire task has completed. It’s certainly not a trivial amount of work. (A minimal cursor-chaining sketch follows this entry’s notes.)
  • Some things you can do very easily with the Mapper library include: modify some property or set of properties for every Entity of a given Kind; delete all entities of a single Kind (the functional equivalent of a “DROP TABLE” if you were using a relational database); or count the occurrences of some property across every single Entity of a given Kind in your datastore.
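As a rough sketch of the hand-rolled plumbing listed above (this is not the Mapper library itself): walk a Kind in chunks with a query cursor and chain the next chunk through the task queue. The worker URL, batch size, and the task queue package name (it lived under com.google.appengine.api.labs.taskqueue in SDKs before 1.4.0) are assumptions here, and the per-entity work is left as a stub.

    import com.google.appengine.api.datastore.*;
    import com.google.appengine.api.taskqueue.Queue;
    import com.google.appengine.api.taskqueue.QueueFactory;
    import com.google.appengine.api.taskqueue.TaskOptions;

    public class ChunkedWorker {
        /** Processes one chunk of the kind, then enqueues a task for the next chunk. */
        public static void processChunk(String kind, String webSafeCursor) {
            DatastoreService ds = DatastoreServiceFactory.getDatastoreService();
            FetchOptions fetch = FetchOptions.Builder.withLimit(100);
            if (webSafeCursor != null) {
                fetch.startCursor(Cursor.fromWebSafeString(webSafeCursor));
            }
            QueryResultList<Entity> results =
                    ds.prepare(new Query(kind)).asQueryResultList(fetch);

            for (Entity e : results) {
                // ... manipulate the entity here ...
            }

            if (!results.isEmpty()) {   // more work remains: chain the next chunk
                Queue queue = QueueFactory.getDefaultQueue();
                queue.add(TaskOptions.Builder.withUrl("/worker")
                        .param("kind", kind)
                        .param("cursor", results.getCursor().toWebSafeString()));
            }
        }
    }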
Esfand S

memcache best practice or framework - Google App Engine for Java | Google Groups - 0 views

  • I made some code public that does what you describe: it is a simple cache interface that has implementations for in-memory, memcache and the datastore. You get about 100MB of heap space to use, which can significantly speed up your caching. There is also a CompositeCache class that allows you to layer the caches so that it first checks in-memory, then memcache, then the datastore. Puts go to all levels, and cache hits refresh the higher levels. For example, if an item is not in memory and has been flushed from memcache but is still present in the datastore, then the other two will be updated.
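A sketch of the layering described above, using a hypothetical Cache interface rather than the poster's published code: a hit at a lower level is written back to the faster levels above it, and puts go to every level.

    public interface Cache {
        Object get(String key);
        void put(String key, Object value);
    }

    class CompositeCache implements Cache {
        private final Cache[] levels;   // e.g. { inMemory, memcache, datastore }

        CompositeCache(Cache... levels) {
            this.levels = levels;
        }

        public Object get(String key) {
            for (int i = 0; i < levels.length; i++) {
                Object value = levels[i].get(key);
                if (value != null) {
                    for (int j = 0; j < i; j++) {
                        levels[j].put(key, value);   // refresh the higher (faster) levels
                    }
                    return value;
                }
            }
            return null;
        }

        public void put(String key, Object value) {
            for (Cache level : levels) {
                level.put(key, value);
            }
        }
    }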