Google App Engine: group items matching "upload" in title, tags, annotations, or URL

GWT, Blobstore, the new high performance image serving API, and cute dogs on office chairs « Ikai Lan says

  • Blobstore crash course: it's best if we give a quick refresher course on the blobstore before we begin. Here's the standard flow for a blobstore upload:
    1. Create a new blobstore session and generate an upload URL for a form to POST to. This is done with the createUploadUrl() method of BlobstoreService; pass it a callback URL, which is where the user will be forwarded after the upload has completed.
    2. Present an upload form to the user. The form's action is the URL generated in step 1. Each URL must be unique: you cannot use the same URL for multiple sessions, as this will cause an error.
    3. After the user has uploaded the file, they are forwarded to the callback URL in your App Engine application specified in step 1. The key of the uploaded blob, a String blob key, is passed as a URL parameter. Save this key and pass the user to their final destination.
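A minimal sketch of that flow on the Java runtime. The servlet path ("/upload"), redirect target ("/view"), and form field name ("photo") are illustrative assumptions; the BlobstoreService calls are the standard ones from that era (getUploadedBlobs was later superseded by getUploads).

    import java.io.IOException;
    import java.util.Map;

    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    import com.google.appengine.api.blobstore.BlobKey;
    import com.google.appengine.api.blobstore.BlobstoreService;
    import com.google.appengine.api.blobstore.BlobstoreServiceFactory;

    // One servlet: GET renders the upload form (steps 1-2), and the blobstore
    // forwards the completed upload back here as a POST (step 3).
    public class PhotoUploadServlet extends HttpServlet {
      private final BlobstoreService blobstore = BlobstoreServiceFactory.getBlobstoreService();

      @Override
      public void doGet(HttpServletRequest req, HttpServletResponse res) throws IOException {
        // Step 1: a one-time upload URL whose callback points back at this servlet.
        String uploadUrl = blobstore.createUploadUrl("/upload");
        // Step 2: the form POSTs the file straight to the blobstore.
        res.setContentType("text/html");
        res.getWriter().println("<form action='" + uploadUrl
            + "' method='post' enctype='multipart/form-data'>"
            + "<input type='file' name='photo'> <input type='submit'></form>");
      }

      @Override
      public void doPost(HttpServletRequest req, HttpServletResponse res) throws IOException {
        // Step 3: the blob is already stored; we just receive its key.
        Map<String, BlobKey> blobs = blobstore.getUploadedBlobs(req);
        BlobKey blobKey = blobs.get("photo");
        // Persist the key, then send the user on to their final destination.
        res.sendRedirect("/view?blob-key=" + blobKey.getKeyString());
      }
    }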

Using the bulkloader with Java App Engine « Ikai Lan says

  • I'm trying to use the bulkloader for a Java program but am running into an interesting issue. My PrimaryKey property is a Long, and in Java I can explicitly give entities id numbers, which show up in the datastore as "id=xxx". When I download the data via appcfg.py I get a reasonable-looking data file. If I re-upload the same file, it actually inserts things into the datastore with key "name=xxx" and therefore doubles every one of my entries.
  • Create a custom uploader using the file upload example provided in App Engine's Java FAQ.
  • App Engine's datastore is schemaless. That is, it is possible to have Entities of the same Kind with completely different sets of properties. Most of the time, this is a good thing. MySQL, for instance, requires a table lock to do a schema update. By being schema-free, migrations can happen lazily, and application developers can check at runtime whether a Property exists on a given Entity, then create or set the value as needed. But there are times when this isn't sufficient. One use case is if we want to change a default value on Entities and grandfather older Entities to the new default value, but we also want the default value to possibly be null.
  • I used a combination of uploading entire chunks of my data via FileUpload (see link below) and explicitly creating my Java objects with the keys that I wanted (which were implicitly defined by the data format: the first one was n and every object after it was n+1). I would then insert the set of objects in bulk; a sketch of this approach follows below. The problem I hit the most was finding the right number of objects per store call; there are specific limits that make this process long and annoying. I ran something locally that would keep trying to upload a chunk of data until it got a good response from the server page. It took me something on the order of 6-8 hours to upload about 1.5M tiny objects. http://code.google.com/appengine/kb/java.html#fileforms
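A hypothetical sketch of the server-side half of that approach using the low-level datastore API: entities are created with explicit numeric ids (so their keys stay "id=xxx"), put in batches, and each batch is retried until it succeeds. The kind name "Record", the batch size, and the retry policy are assumptions, not taken from the post.

    import java.util.ArrayList;
    import java.util.List;

    import com.google.appengine.api.datastore.DatastoreService;
    import com.google.appengine.api.datastore.DatastoreServiceFactory;
    import com.google.appengine.api.datastore.Entity;

    public class BulkInserter {
      private static final int BATCH_SIZE = 500;  // stay under the datastore batch limit
      private final DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();

      public void insertAll(List<String> lines) {
        List<Entity> batch = new ArrayList<Entity>(BATCH_SIZE);
        long id = 1;  // ids are implicit in the data format: 1, 2, 3, ...
        for (String line : lines) {
          Entity e = new Entity("Record", id++);  // explicit numeric id keeps the key as "id=xxx"
          e.setProperty("payload", line);
          batch.add(e);
          if (batch.size() == BATCH_SIZE) {
            putWithRetry(batch);
            batch.clear();
          }
        }
        if (!batch.isEmpty()) {
          putWithRetry(batch);
        }
      }

      private void putWithRetry(List<Entity> batch) {
        while (true) {
          try {
            datastore.put(batch);  // batch put of the whole chunk
            return;
          } catch (RuntimeException e) {
            // Timeouts were common with large chunks; back off briefly and retry the same batch.
            try { Thread.sleep(1000); } catch (InterruptedException ignored) { }
          }
        }
      }
    }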

Does Eclipse upload 3rd-party GWT libraries to GAE? - Stack Overflow

  • Cold-start latency is determined by the time it takes to load all the classes needed to handle the request. If you upload a JAR file, but nothing references it, it won't be loaded, and thus won't affect your cold-start latency.
  • Only the jars under WEB-INF/lib will be uploaded to GAE. You can prevent GWT jars from being uploaded by not placing them under WEB-INF/lib and instead referencing them externally in your project build path.

Google App Engine for Java Questions - Google App Engine - Google Code

  • How do I handle multipart form data? or How do I handle file uploads to my app? You can obtain the uploaded file data from a multipart form post using classes from the Apache Commons FileUpload package. Specifically, you may want to use FileItemStream, FileItemIterator and ServletFileUpload as illustrated below. If you see a java.lang.NoClassDefFoundError after starting your application, make sure that the Apache Commons FileUpload JAR file has been copied to your war/WEB-INF/lib directory and added to your build path.

    import org.apache.commons.fileupload.FileItemStream;
    import org.apache.commons.fileupload.FileItemIterator;
    import org.apache.commons.fileupload.servlet.ServletFileUpload;

    import java.io.InputStream;
    import java.io.IOException;
    import java.util.logging.Logger;

    import javax.servlet.ServletException;
    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    public class FileUpload extends HttpServlet {
      private static final Logger log =
          Logger.getLogger(FileUpload.class.getName());

      public void doPost(HttpServletRequest req, HttpServletResponse res)
          throws ServletException, IOException {
        try {
          ServletFileUpload upload = new ServletFileUpload();
          res.setContentType("text/plain");

          FileItemIterator iterator = upload.getItemIterator(req);
          while (iterator.hasNext()) {
            FileItemStream item = iterator.next();
            InputStream stream = item.openStream();

            if (item.isFormField()) {
              log.warning("Got a form field: " + item.getFieldName());
            } else {
              log.warning("Got an uploaded file: " + item.getFieldName() +
                          ", name = " + item.getName());

              // You now have the filename (item.getName()) and the contents
              // (which you can read from stream). Here we just print them back
              // out to the servlet output stream, but you will probably want to
              // do something more interesting (for example, wrap them in a Blob
              // and commit them to the datastore).
              int len;
              byte[] buffer = new byte[8192];
              while ((len = stream.read(buffer, 0, buffer.length)) != -1) {
                res.getOutputStream().write(buffer, 0, len);
              }
            }
          }
        } catch (Exception ex) {
          throw new ServletException(ex);
        }
      }
    }

Episode 13: Using the Blobstore Java API « Google App Engine Java Experiments

  • The Blobstore API provides two types of functions:
    1. An ability to upload and save the blob automatically. The BlobstoreService, provided by com.google.appengine.api.blobstore.BlobstoreService, allows us to specify a URL where users can upload their large files. You can think of this URL as the action element in the HTML form. The implementation at this URL is internal to the BlobstoreService, but what it does is significant: it extracts the file contents that you uploaded and stores them as a Blob, and each stored blob is associated with a Blob Key. This Blob Key is then provided to your callback URL, and you can use it to do anything within your application. In our case, we form a URL that is tweeted to the users, who can then view the picture that we uploaded.
    2. An ability to serve or retrieve the blob. The BlobstoreService also provides the ability to serve a blob that was saved successfully. It writes the blob to a response that you could then use, for example, as the source of an <img> element. All you need to do is give it the Blob Key and the response stream, and in return it provides the content properly encoded according to its type.
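A minimal sketch of the serving side. The servlet mapping and the "blob-key" request parameter name are assumptions; serve() is the standard BlobstoreService call that streams the blob back with its stored content type, so this servlet's URL can be used directly as an <img> src.

    import java.io.IOException;

    import javax.servlet.http.HttpServlet;
    import javax.servlet.http.HttpServletRequest;
    import javax.servlet.http.HttpServletResponse;

    import com.google.appengine.api.blobstore.BlobKey;
    import com.google.appengine.api.blobstore.BlobstoreService;
    import com.google.appengine.api.blobstore.BlobstoreServiceFactory;

    public class ServeBlobServlet extends HttpServlet {
      private final BlobstoreService blobstore = BlobstoreServiceFactory.getBlobstoreService();

      @Override
      public void doGet(HttpServletRequest req, HttpServletResponse res) throws IOException {
        // Hand the service the blob key and the response; it does the rest.
        BlobKey blobKey = new BlobKey(req.getParameter("blob-key"));
        blobstore.serve(blobKey, res);
      }
    }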

Using the App Engine Mapper for bulk data import « Ikai Lan says

  • The most obvious use case is data import. A developer looking to import large amounts of data would take the following steps (a sketch of the per-line import step follows after these annotations):
    1. Create a CSV file containing the data you want to import. The assumption here is that each line of data corresponds to a datastore entity you want to create.
    2. Upload the CSV file to the blobstore. You'll need billing to be enabled for this to work.
    3. Create your Mapper, push it live, and run your job to import your data.
    This isn't meant to be a replacement for the bulk uploader tool, merely an alternative. This method requires considerably more programmatic work for custom data transforms. The advantage of this method is that the work is done on the server side, whereas the bulk uploader uses the remote API to get the work done. Let's get started on each of the steps.
  • to build Mappers that map across some large, contiguous piece of data as opposed to Entities in the datastore
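The Mapper wiring itself isn't quoted above, but the per-line work each map() call would perform is simple: parse one CSV line into an entity and save it. A hypothetical sketch; the kind name, column layout, and the direct put() (rather than the mapreduce framework's mutation pool) are all assumptions.

    import com.google.appengine.api.datastore.DatastoreService;
    import com.google.appengine.api.datastore.DatastoreServiceFactory;
    import com.google.appengine.api.datastore.Entity;

    public class CsvLineImporter {
      private final DatastoreService datastore = DatastoreServiceFactory.getDatastoreService();

      // One CSV line corresponds to one entity, e.g. "42,Widget,19.99".
      public void importLine(String line) {
        String[] cols = line.split(",");
        Entity product = new Entity("Product", Long.parseLong(cols[0]));  // explicit numeric id
        product.setProperty("name", cols[1]);
        product.setProperty("price", Double.parseDouble(cols[2]));
        datastore.put(product);
      }
    }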

Uploading and Downloading Data - Google App Engine - Google Code

  • import_transform A single-argument function that returns the correct value and type of data based on the external_name or import_template strings. Examples include the built-in Python conversion operators (such as float), any of several helper functions provided in transform (such as get_date_time or generate_foreign_key), a function provided in your own library, or an in-line lambda function. Alternatively, a two-argument function with the keyword argument bulkload_state, which on return contains useful information about the entity: bulkload_state.current_entity, the current entity being processed; bulkload_state.current_dictionary, the current export dictionary; and bulkload_state.filename, the --filename argument that was passed to appcfg.py.
  • import_template Specifies multiple dictionary items for a single property, using Python string interpolation.
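For orientation, a sketch of how these two options sit inside a bulkloader.yaml property_map entry. The kind, column names, and transforms here are illustrative assumptions; check the page above for the authoritative schema.

    transformers:
    - kind: Product                 # assumed kind
      connector: csv
      property_map:
        - property: price
          external_name: price
          import_transform: float   # a built-in conversion operator, as described above
        - property: address
          import_template: "%(street)s, %(city)s"   # several dictionary items -> one property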

Uploading and Downloading Data - Google App Engine - Google Code

  • The data loader feature communicates with your application running on App Engine using remote_api, a request handler included with the App Engine runtime environment that allows remote applications with the proper credentials to access the datastore remotely. To use remote_api, you must map a URL to it.
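On the Java runtime (as in the next item below), mapping a URL to remote_api typically means declaring the SDK's RemoteApiServlet in web.xml. A minimal sketch, assuming the conventional /remote_api path:

    <!-- Expose the datastore to remote_api clients such as bulkloader.py. -->
    <servlet>
      <servlet-name>RemoteApi</servlet-name>
      <servlet-class>com.google.apphosting.utils.remoteapi.RemoteApiServlet</servlet-class>
    </servlet>
    <servlet-mapping>
      <servlet-name>RemoteApi</servlet-name>
      <url-pattern>/remote_api</url-pattern>
    </servlet-mapping>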

Coding For Rent

  • Now you can choose to put this in an existing version of your application's config.ru, or in a new version (i.e. a version used only for bulk upload/download, without your application code). Either way, once a version with the RemoteApiServlet is uploaded, run this command to download your data:
    bulkloader.py --dump --app_id="application id" --url="url_to_remote_api_servlet" --filename="file to download data to"
    Here is an example from one of my apps:
    bulkloader.py --dump --app_id=jsm277 --url=http://bulkmove.latest.jsm277.appspot.com/remote_api --filename=test_download_data
    This command will download every object in your application's datastore. If you only want to download objects of one kind, simply add the --kind= option. Another example:
    bulkloader.py --dump --app_id=jsm277 --url=http://bulkmove.latest.jsm277.appspot.com/remote_api --filename=test_download_data --kind=Posts
    To restore the data you have just downloaded, use this command:
    bulkloader.py --restore --url="url_to_remote_api_servlet" --filename="file name from dump command" --app_id="application id"
    Here is an example command:
    bulkloader.py --restore --url=http://bulkmove.latest.railsturbinetest.appspot.com/remote_api --filename=test_download_data --app_id=railsturbinetest
    There are several important things to keep in mind when restoring datastore objects. The restore command simply restores whatever is in the file you provide, so it does not matter whether you dumped your entire datastore or just one kind: you use the same command to restore either. You can restore datastore objects to a different application than the one they were downloaded from. However, it is important to keep in mind that each object's key is used to restore it, so if an object already exists in the datastore with the same key as an object being restored, the object in the datastore will be overwritten. bulkloader.py is a good tool for moving data between applications on Google App Engine and for backing up your data locally. However, it is not good for data conversion or manipulation of any kind, since the data is stored in a binary form. So happy data moving.

Uploading files from the browser - Gaelyk | Google Groups

  • Personally, I would recommend using the blobstore API for this, since it is officially supported by Google, which means it won't break in the future. It also has nice bindings into the Images service. http://code.google.com/appengine/docs/java/blobstore/overview.html

Google App Engine Cold Start Tips - Don't Use JSP

  • This tip can cut a few hundred milliseconds or more off of your cold start time. Using JSPs slows you down in two ways. First, if you have a file with a .jsp extension anywhere in your website directory, appcfg will detect that you use JSP, and it will add the 8 libraries used for processing JSPs into your lib directory before uploading to App Engine. These libraries total two megabytes in size, and simply having them in your lib directory slows your cold starts down by a few hundred milliseconds. If you are interested in which libraries these are, you can see them in the temporary folder appcfg creates while uploading your app. Second, the first JSP file accessed after a cold start, even an empty one, takes a few hundred additional milliseconds to be processed. I'm not sure what causes this; maybe it has to do with initializing the JSP processor.

How to use the data in local datastore uploaded by bulk loader? - Google App Engine for Java | Google Groups

  • You can use the RemoteDatastore class to upload or download from a normal Java application to your local or a remote datastore. It takes care of setting up a dummy Environment and ApiProxy.Delegate for you. You can then read local files unrestricted and use the low-level API to insert your data. http://code.google.com/p/remote-datastore/ You just need to call RemoteDatastore.install() and ignore the other steps about connecting to a remote datastore.