From 64705a75b6ece818c1a5c3c55ff686ea155c856b Mon Sep 17 00:00:00 2001 From: Sebastian Thiel Date: Sun, 25 Sep 2022 19:56:50 +0800 Subject: [PATCH] Release google-apis-common v4.0.0 --- google-apis-common/CHANGELOG.md | 2403 ++++++++++--------------------- 1 file changed, 756 insertions(+), 1647 deletions(-) diff --git a/google-apis-common/CHANGELOG.md b/google-apis-common/CHANGELOG.md index 7079605c4a..83f0db71f9 100644 --- a/google-apis-common/CHANGELOG.md +++ b/google-apis-common/CHANGELOG.md @@ -5,7 +5,212 @@ All notable changes to this project will be documented in this file. The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/), and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html). -## Unreleased +## 4.0.0 (2022-09-25) + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + The initial release of the crate that is shared among all google-api crates. This allows trait-implementations to be re-used as they are not unique to their respective crate. @@ -398,6 +603,40 @@ you need. ### Documentation + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + - add lang attribute to docs index Use the W3 standard "lang" attribute on the "html" tag to provide context for browsers and screen readers. @@ -435,111 +674,165 @@ you need. [skip ci] - update STRUCT_FLAG and UPLOAD flags * adjust documentation to resemble actual upload flag semantics. It was - still using the one previously used in docopt. - * Make -m optional, defaulting to 'application/octet-stream' - - Should have been fixed alongside of #81 - - visual gap between cursor and kv - Previously, the space was barely visible, confusing even myself :). - Now it's clear, using 4 spaces, that there is a cursor invocation - followed by a key-value pair. - - add link to general documentation - [skip ci] - - request values are moved, not borrowed - [skip ci] - - filled README.md - All possible documentation was added in a quality sufficient for - a first release. After all, everything there is is documented. - - integrate different program types - * put program type inforamtion into shared.yaml to allow accessing it + still using the one previously used in docopt. +* Make -m optional, defaulting to 'application/octet-stream' +* put program type inforamtion into shared.yaml to allow accessing it from the index.html.mako template. - - random values + cursor information - * Instead of writing pod-types, we generate a random value of the +* Instead of writing pod-types, we generate a random value of the required type. - * Fully document how cursors can be set, which is all that's usually +* Fully document how cursors can be set, which is all that's usually demonstrated in more complex dynamic structure documentation - - absolute top-level cursor + details - * just for show, use absolute cursors in the top-level structure - * indicate you are setting an array or hashmap in the details - - relative cursor positioning - It would still be nice though to show absolute positioning as well. - - dynamic absolute cursor position example - We build all required -r flags using absolute cursor positions only. 
- The next step should be to use relative ones, and of course be more - verbose about how this should be interpreted (sequential). - - upload and output flag - We are already there, except for documenting the request value type, - which definitely deserves a separate issue. - - optional paramters - Added documentation for flags setting all kinds of optional parameters. - - inforamtion about setting structs - For now we just have a 'dum' example, but once we are there, we shall - make the example and documentation based on the actual request value. - - This requires some additional work, which fortunately has to be done - in python only. - - add required scalar arguments - - name default scope in API docs - - added CLI scope documentation - In addition to that, they can now be set as well. - Unified generation of the 'default' scope. - - update to include CLI targets - - minor phrasing changes - Also removed superfluous 'extern' for tests - - deal with 'virtual' methods resource - We assure to know about it, instead of writing nonsense about that - 'methods' resources which does not actually exist. - - I am relatively sure to have found all the spots. - - method features and general info - * add method listing for various categories, like 'downloads' and +* just for show, use absolute cursors in the top-level structure +* indicate you are setting an array or hashmap in the details +* add method listing for various categories, like 'downloads' and 'uploads' - * add general information on how to do downloads and uploads using +* add general information on how to do downloads and uploads using various protocols - - add build instructions - These should help people to get started on their own. - - initial version - It's still rather simple, but a basis for further improvements - - result handling and remaining todos - Basically there is no todo left, which puts us in a good position for - implementing more features, and get some feedback in the meanwhile. - - bigger font for doc-index - - for additional parameters - Based on the parameters suitable for the entire API. One could also - make them available in the builder ... . - - cross linking of resources/activities - This makes it so much easier to get to the example call you are - interested in. - It's getting there, slowly ;) - - docs for terms.upload methods - Also fs::File is now used with prefix, to prevent clashes. - - scope docs for method builders - - fixed spacing - Also, the `do()` implementation was moved into it's own def, even - though it's still quite empty. - - improved spacing - - added info about settable parts - It's not as good as the parts info on the website, but it's something ! - At least people don't have to read the text, but find this information - in all the spots that are relevant to this. - - more information, nicer visuals - - method builder call example - With nearly fully randomized examples to show how it can be done. - It's quite nice to see actual calls, using everything required to get - a call. The only thing the user has to manage is to fill in actual - values. - - But, it also shows that our builder pattern doesn't work yet due to ... - you guessed it ... lifetime issues :D - - library overview as far as possible - Everything we have, feature wise, is now documented in a first version - at least. - - We shall keep this uptodate with what we are implementing, which also - helps figuring out a good api. 
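To make the builder-call style referenced in the documentation notes above more concrete, here is a self-contained sketch of the pattern; `Hub`, `VideoListCall`, and every field name are illustrative stand-ins, not types from a generated crate.

```rust
// Sketch of the generated builder-call style: get a resource builder from the
// hub, chain parameter setters, then execute with `doit()`. All names invented.
struct Hub;

struct VideoListCall {
    part: String,
    max_results: u32,
}

impl Hub {
    fn videos(&self) -> VideoListCall {
        VideoListCall { part: String::new(), max_results: 5 }
    }
}

impl VideoListCall {
    // Each optional parameter gets a chainable setter on the call builder.
    fn part(mut self, part: &str) -> Self {
        self.part = part.to_owned();
        self
    }
    fn max_results(mut self, n: u32) -> Self {
        self.max_results = n;
        self
    }
    // In generated code `doit()` performs the HTTP request; here it only
    // reports what would be sent.
    fn doit(self) -> Result<String, String> {
        Ok(format!("GET videos?part={}&maxResults={}", self.part, self.max_results))
    }
}

fn main() {
    let hub = Hub;
    let result = hub.videos().part("snippet").max_results(10).doit();
    println!("{:?}", result);
}
```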
### New Features + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + - Support custom connectors Switch the constraints on Hub types to use public traits based on tower::service, as recommended by Hyper. This enables support for @@ -578,682 +871,320 @@ you need. resources. - added back-link to crates.io * url is created per-API and features a nice crates image coming - from githubusercontent. - - did you mean for struct values - * functionality is cursor-aware, and fixes the actual string the user + from githubusercontent. +* functionality is cursor-aware, and fixes the actual string the user passed in. That way, it is made very clear how the suggested value is to be used. - * it's a known weakness of the implementation that it operates on a +* it's a known weakness of the implementation that it operates on a flattened list of field names, and thus may make nonsensical suggestions. - * added punctuation to all errors - - `-u ` parsing - * As `possible_values()` applies to all arguments, we cannot use it +* added punctuation to all errors +* As `possible_values()` applies to all arguments, we cannot use it anymore but have to check the UploadProtocol type ourselves. Besides that, switching to the latest `clap` simplified our lives a little. - * ajusted docs to not enforce using `-r` all the time - - adjust to serde usage in `yup-oauth` - * More detailed error type for JsonTokenStorage - * removed all traces of rustc_serialize - * use pretty-printers everywhere to allow writing human-readable json +* ajusted docs to not enforce using `-r` all the time +* More detailed error type for JsonTokenStorage +* removed all traces of rustc_serialize +* use pretty-printers everywhere to allow writing human-readable json files for secretes and for tokens - - implement -u as good as possible - We can't have the `-u ` style yet, but - https://github.com/kbknapp/clap-rs/issues/88 might help with that - at some point. - - Related to #92 and #81 - - parse structure and build App - We are currently setting everything up at runtime, and manage to get - nearly all information into it, except for the more complex - `-u (simple|resumable) ` flag. - - initial version of command generation - It compiles and works, even though there are many things we want to - improve. - - One big question is how to define multi-arguments, like -u foo bar baz. - - setup infrastructure - This allows us to setup clap and see if it compiles, which is the prime - goal of the current workflow step. - - Related to #81 - - simple linux deployment script - It's made for a linux machine, not for docker - - simple osx deploy script - * added simple script to build tar archive with all debug/release +* added simple script to build tar archive with all debug/release binaries. - * slightly improved docker script, even though it would need additional +* slightly improved docker script, even though it would need additional work. For now, I use the cloud VM anyway - - improved error handling - We are now able to decode detailed errors and pass them on. This allows - the CLI to provide more useful error responses. - Additionally, the CLI will only print debug responses in --debug mode. 
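The "improved error handling" entry above is about decoding the structured error body Google services return, so the CLI can show more than a bare status code. A minimal sketch of that idea, assuming the common `{"error": {"code", "message"}}` shape and using `serde_json`; the helper name is made up.

```rust
// Pull the structured details out of a Google-style JSON error body.
use serde_json::Value;

fn describe_error(body: &str) -> Option<String> {
    let v: Value = serde_json::from_str(body).ok()?;
    let err = v.get("error")?;
    let code = err.get("code")?.as_i64()?;
    let message = err.get("message")?.as_str()?;
    Some(format!("server error {}: {}", code, message))
}

fn main() {
    let body = r#"{"error": {"code": 403, "message": "quota exceeded"}}"#;
    println!(
        "{}",
        describe_error(body).unwrap_or_else(|| "unrecognized error body".into())
    );
}
```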
- - per-API-credentials with default - That way, we can provide better service, as CLIs that consume a lot of - quota can easily have their own app credentials, and with it, their - own quota. - - The fallback will be a project that allows to use all possible - google APIs. - - The user can always put in his own application secret to use his own - quota or even paid services. - - hashmap handling - * with native support for type conversion and error handling - * improved hash-map key-value parsing to at least state that it knows +* with native support for type conversion and error handling +* improved hash-map key-value parsing to at least state that it knows it's dealing with a hashmap. Error text is still not what it should be because we don't know at runtime (initially) what type we handle. - - repeated required args - * Seem to work for docopt, mkdocs and code itself - * mkdocs now show type of required params - * some code which deals with converting elements to their +* Seem to work for docopt, mkdocs and code itself +* mkdocs now show type of required params +* some code which deals with converting elements to their target types is totally untested right now. - - Related to #77 - - --debug-auth flag - * Allow to see all authentication related communication, similar to +* Allow to see all authentication related communication, similar to --debug flag otherwise. - * fixed broken generator when handling request value parsing. - - --debug flag to output traffix - * If `--debug` is set, we will output all server communication to +* fixed broken generator when handling request value parsing. +* If `--debug` is set, we will output all server communication to stderr. That way, we can compare our requests to what is expected by ush based on official docs. - * `discovery` now doesn't use the API key anymore - this is specified +* `discovery` now doesn't use the API key anymore - this is specified using a custom override. - - Nice, we are totally ready to test and fix all API features. - - Related to #70 - - added first versions of all CLI - That way, changes can be tracked. - Also, we make it official. - - Future checkins will only be made if major changes were done, - similar to how the APIs are handled. - - Related to #64 - - struct value parsing - This works already for simple request values, but doens't generate - compiling code for structures with Parts in them. - Nonetheless, it's a big step towards finishing the overall issue. - - Related to #64 - - field cursor complete and untested - Tests just need to be run, and of course, the impementation might need - fixing. - - Related to #64 - - make respective uppload_call - Now we actually provide the information required to upload data in a - simple or resumable fashion. - - upload flag parsing - We handle errors gracefully with costum types and minimal amount of - code. Unfortunately, Mime type parsing is very 'flexible', allowing - nonesense types to be passed easily. - - Related to #62 - - global optional parameters+DL tracking - * set globally shared parameters (which includes 'alt') - * track if 'alt' is set to 'media' at runtime to do the right thing when +* set globally shared parameters (which includes 'alt') +* track if 'alt' is set to 'media' at runtime to do the right thing when outputting the result. 
There is still an issue to be fixed though - - Related to #61 - - parse method parameters and set them - It's implemented in a working fashion, except that the default value - is not currently set to something sensible, causing duplicate errors in - case the key-value syntax is wrong. - - Related to #61 - - handle output json encoding and ostreams - * support for encoding response schemas to json - * support for simple downloads (without alt=media) - - interpret output arguments - For now we don't properly handle errors when opening files, but the - code is there. - Will panic in next commit. - - Related to #63 - - required arg parsing + first doit() call - We are parsing required scalar values and handle parse-errors correctly, - to the point were we make a simple, non-upload doit() call. - - It shows that we seem to build invalid calls, for now,but that's nothing - we can't fix once the time is ripe. - - Next goals will be related to finalizing the argument parsing code. - - infrastructure for call and dry-run - Now we are able to cleanly handle our arguments on a per-method basis. - The generated code won't clutter our design as we put the details into - their own methods. - - Implementation of JsonTokenStorage - It's also used by the code, replacing the previous standing, - MemoryStorage. - - init hub + refactor for dry-run mode - The hub is just using preset types - we will have to implement our own - storage and auth-delegate, as well as a Hub delegate at some point. - - Dry run mode allows us to check for errors and use a call builder - using the very same code. - - Display + Error traits for Error struct - * improved documentation about error handling, it's less verbose yet +* support for encoding response schemas to json +* support for simple downloads (without alt=media) +* improved documentation about error handling, it's less verbose yet explains what you can do. - - engine checks resource and method args - We are now at a spot where we can actually start parsing arguments. - - * ArgumentError -> ClIError - seems more fitting - - write default and read app-secret - * if there is no secret file in json format, we write a default one +* ArgumentError -> ClIError - seems more fitting +* if there is no secret file in json format, we write a default one that we will then read in a second iteration of the loop. That way, the user has an example of how such a file must look like. - - Next step is to cleanup the error type and implement the Error trait. - - create config directory, if possible - * Only supports one level of directory - * full error handling, and uses memory efficiently - - infrastructure - * allow usage of cmn.rs for common types (like Error types) - * instantiate an engine and handle errors, in an initial quick and dirty +* Only supports one level of directory +* full error handling, and uses memory efficiently +* allow usage of cmn.rs for common types (like Error types) +* instantiate an engine and handle errors, in an initial quick and dirty way. - - Fixes #52 - - generate complete docopts grammar - Grammar is laid out per method, providing general purpose arguments - only as needed/supported. - - All details will be contained in the markdown documentation. - - Related to #45 - - per-method-markdown-files - That way, all information can be placed within a single markdown file - per method call. This will keep loading times low while maximizing - usability. - - That way, it's comparable to the API documentation, which is most - detailed on a per-method basis as well. 
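The `JsonTokenStorage` entry above replaces the in-memory storage with tokens persisted as human-readable JSON. A rough sketch of that approach using `serde_json` and `std::fs`; the file name and token fields are invented for illustration.

```rust
// Persist and reload a token as pretty-printed JSON so users can inspect it.
use serde_json::{json, Value};
use std::fs;
use std::path::Path;

fn store_token(path: &Path, access_token: &str, refresh_token: &str) -> std::io::Result<()> {
    let token = json!({
        "access_token": access_token,
        "refresh_token": refresh_token,
    });
    // Pretty-printing keeps the file editable and inspectable by the user.
    fs::write(path, serde_json::to_string_pretty(&token).expect("valid JSON"))
}

fn load_token(path: &Path) -> Option<Value> {
    let data = fs::read_to_string(path).ok()?;
    serde_json::from_str(&data).ok()
}

fn main() -> std::io::Result<()> {
    let path = Path::new("token.json");
    store_token(path, "ya29.example", "1/refresh-example")?;
    println!("{:?}", load_token(path));
    Ok(())
}
```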
- - cli postprocessing support - That way, a single huge markdown file containing documentation for - commands and methods can be split up into multiple files for - individual inclusion in mkdocs. - - It's done by a post-processor which is loaded by mako-render, providing - access to the entire context. Said processor may also drop results - altogether and thus prevent files to be written that have been split up - by it. - - docopt subcommands - Setup command/subcommand pattern. - Next will be the infrastucture for documenting these, using mkdocs - and markdown. - - bin renaming + docopt infrastructure - * allow to rename executables, for now just brute-force using a boolean +* allow to rename executables, for now just brute-force using a boolean flag. If we have more binaries at some point, we might want to be more elaborate. - * everything related to docopts functionality is now in the docopts +* everything related to docopts functionality is now in the docopts module. Related to #45 - - basic usage of docopts - For now we just show it works within our generator. - Next step is to actually generate docopts grammar. - - mkdocs generator works now - It can be selected for each type of program we want to build, and makes - sense for everything that is not a library. - - We also tried to unify names and folders a bit more, even though there - certainly is more work to be done to be fully non-redundant. - - cli depends on API, generically - This allows us to build efficiently. CLI programs can now have their - own cmn.rs implementation, which we can test standalone with - `cargo test`. - - The primary makefile currently just explicitly pulls in the type-*.yaml, - one day we could possibly put it into a loop. - - api generation works once again - With the new structure, it should be easy to add CLI programs with - proper dependencies accordingly. - - Resumable upload implemented - With all bells and whisles. For now, we don't have a good return value - to indicate that the operation was cancelled, which needs fixing. - - implement query_transfer_status() - The delegate logic is implemented and seems sound. - It's somewhat funny that after all this back and forth, all we get - is a valid start position for the upload. - - ContentRange header (parse and format) - Now we are able to send the transfer-update requests and implement the - actual chunk logic. - - use of oauth2::Scheme - That way, we improved our API, reduced code bloat, and are very clear - about the what we do for Authorization. - - crate version + - That way, crate names reveal exact inforamtion about the contained - API revision. - - * crate version: code gen version - * + (build-metadata): exact version of API schema - - check upload size against max-size - - - make actual `store_upload_url()` call - We also assure to call only as often as we have to, keeping some state - between the loops accordingly. - - improved delegate calls - The delegate will be asked for an upload URL, that he may store during - yet another call. - - resumable-upload infrastructure - Layout the `ResumableUploadHelper` and implement the entire logic - with the mbuild renerator. - - All that's left to be done is to implement the 'chunked upload' method. - - The borrow checker helped me to prevent a bug as well. - - don't crash if json decode fails. - Instead, tell the delegate about it and return the error. - - mark unused types with marker trait - For some reason, some google APIs define types they never use. 
We now - mark them, just because we can, to show our superiority ;) ;) ;) :D . - - support for 'variant' schema - Documentation links, at one spot, have been updated as well. - The variant schema is represented natively as enum, it all looks - very good. - - Json has been taken care of as well ... . - - Option<_> in schema only if needed - This means that only part fields will be optional. - - added field aliases, were needed - This makes sure our fields can properly be decoded. - - use serge instead of serialize - However, for some reason, the `Serialize/Deserialize` macros don't work - for me, even though they work just fine in the respective tests of - the serge crate. What am I possibly doing wrong ? - - simplify delegate calls - Now we use the DefaultDelegate as standin in case there is user-delgate. - That way, we save plenty of complexity as no additional - `if let Some(ref mut dlg) = delegate` is necesary. - - prevent duplicate schema types - These could clash with types we import from Cmn. When that happens, - just a single list must be adjusted for a fix, see - `unique_type_name` - - begin()/finished() calls - During `begin()`, the delegate receives additional information about the - current call, which can be useful for performance tracking, among - other things. - - alt 'media' handling to allow dls - This also includes documentation to state which methods actually support - media download, and how to achieve that. - - Added TODO to not forget we should tell the user how to achieve these - kinds of things. - - crates with 'google-' prefix - - - allow to set user-agent - - - optimizations and simplification; seek - * MultiPartReader is using match to handle state, reducing unnecessary +* crate version: code gen version +* + (build-metadata): exact version of API schema +* MultiPartReader is using match to handle state, reducing unnecessary calls to 0 in that regard. - * Fixed seek() calls on readers, assuring they are reset to start each +* Fixed seek() calls on readers, assuring they are reset to start each time the loop is done. - * both media-parameters now use `ReadSeek` streams. - * Use `seek()` to figure out size, simplifying the interface. - - optimized memory allocation and options - * reserve_exact(X) where possible (params, multi-part-reader) - * `if let` used whereever possible to prevent duplicate checks - - This increases the possible performance, and makes for more readable, - concise code. - - multibytereader single byte test - It shows that we actually don't handle our state correctly. - The first test which reads to string obviously uses a big-enough buffer. - - MultiPartReader is working. - Something that is missing is a single-byte read test - - initial part writing - We are a state-machine, and handle parts of it correctly. - However, we don't yet write the boundary at all, and could improve - our use of match. - - multi-part mime-type and add_parts() - Next we will implement the actual Read method - - handle 'alt' param - It's conditionally set to json, if we expect a response value. - - more multipart infrastructure - * outer frame of `MultiPartReader` to allow using it in `doit()` - * restructured `doit()` to get content-types right - - There is more work to do, as it currently doesn't compile, nor - do we deal with our streams correctly. - - But we are on a good way. - - improve body infrastructure - This will support choosing custom readers at runtime, depending on - whether we have a resumable or simple media upload. 
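Several entries above mention using `seek()` to determine a media stream's size and resetting the reader before the upload starts. A small sketch of that trick:

```rust
// Determine a stream's length by seeking to the end, then rewind so the
// upload starts reading at byte 0 again.
use std::io::{Cursor, Seek, SeekFrom};

fn stream_len<S: Seek>(stream: &mut S) -> std::io::Result<u64> {
    let len = stream.seek(SeekFrom::End(0))?;
    stream.seek(SeekFrom::Start(0))?; // reset for the actual read
    Ok(len)
}

fn main() -> std::io::Result<()> {
    let mut media = Cursor::new(vec![0u8; 1024]);
    println!("media size: {} bytes", stream_len(&mut media)?);
    Ok(())
}
```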
- - simplify URL_ENCODE handling - More maintainable template code, with less redundancy. - - uri-template handling complete - We now handle url-encoding for the parameters that would require it, - and can deal with repeated params that will match '/param*'. - - uri-template generation works - This doesn't mean it's correctly implemented, but we are on our way. - It does compile, at least - - repeated types in examples - Made sure usage examples know how to use repeated types. - - repeatable parameters working - The code dealing with them currently assumes they are "/" separated. - - intermed. support for 'methods' - These 'methods' have no resources, and need slightly special handling. - This version at least makes the generator work, even though - it produces duplicates. - - However, as it is so ugly, I'd rather consider to change it - substantially ... this feature should just come naturally. - - partial implementation of url expr - URL expressions allow to substitute values within the URL with - parameters. However, this is not only a simple key-value replacement, - but supports expressions that need a parser. - - This one will have to be implemented next. - - set upload media type - Related to #17 - - add more obvious crate and api version - - - pre-request delegate call. - This one is likely to change the further we advance in the upload-media - implementation. - - json decode and delegation - Now json errors are handled and delegated with the option to retry, - and all other values are just decoded according to plan. - - For now, I brutally unwrap json values assuming this will work, because - it really should work. - - authentication with and without scopes - It's quite rough around the edges, but has a slight chance to work. - Will still to handle return values accordingly. - - attempt to send json-encoded request - This doesn't work yet, as I am unable to unwrap the client properly. - It's a refcell that contains a BorrowMut to a hyper::Client, and - lets just, it's complicated. - - add cargo.toml dependency information - - - docs and tests of youtube3 on travis - This might already bring it close to 7 minutes runtime, which seems - like providing us with a buffer big enough for when it is - feature-complete. - - update-json using discovery API - Instead of depending on the google go client API repository, I now - use the original data source, namely the discovery API. - - full usage example on landing page - Related to #4 - - oauth22 -> oauth2_v2 - Related to #3 - - improved library names - Related to #3 - - new github-pages target - For import of all docs to the github - - now we pre-generate nested schemas - Into a complete, global list of schemas, with additional meta-data. - - However, it's currently not complete, as $refs are missing. - There is some resemblance to to_rust_type(...), which worries me - slightly - - part 1 to implement 'any' type - It is a Json object, with a schema as defined elsewhere. It's quite - cool to see this (nearly) working already. However, it will require - us to transitively assign the required markers which is based - on information we don't currently have. - - Maybe implementing this could also help to simplify name-clash checks - or make them better at least ? - - build all apis, were possible - Now there is a blacklist feature, allowing to list apis we can't yet - handle for whichever reason. - - new Scope enum type - For use in all places where scopes are desired. It will also be made - available for adding scopes by the user. 
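The "new Scope enum type" entry above maps each scope to its OAuth2 URL. The shape might look roughly like the sketch below; the variant names and URLs are examples, not taken from any particular generated API.

```rust
// Illustrative Scope enum: each variant resolves to its OAuth2 scope URL.
#[derive(Clone, Copy)]
enum Scope {
    Readonly,
    Full,
}

impl AsRef<str> for Scope {
    fn as_ref(&self) -> &str {
        match self {
            Scope::Readonly => "https://www.googleapis.com/auth/example.readonly",
            Scope::Full => "https://www.googleapis.com/auth/example",
        }
    }
}

fn main() {
    for s in &[Scope::Readonly, Scope::Full] {
        println!("{}", s.as_ref());
    }
}
```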
- - scope as property ... - ... however, it will become an enumeration, as I don't like people - putting in strings all by themselves. This also means we have to - generate good enums ourselves. - - query string setup - It works for uploads as well as for others. - - Next up is to setup the head and authentication. It will be as simple - as calling and handling `GetToken`, even though I think that there - needs to be better support for the scope that is asked for ... . - - generic result type - ... and we actually add additional fields to our fields list. - - additional fields and Result type - Now query params are handled completely, including clash check. - Additionally, there is a new result type which encapsulates everything. - - It must be typed to the actual result type though, which can be a - request result - - put all fields onto a list - Also handle the case when the 'part' field is generated from the - request. Additional params still need work - - spike to see how delegate can be work - To avoid an additional type parameter, we will use dynamic dispatch - for the delegate. - - Having function overrides at some point seems like an excercise better - left for version 1.1 ;) - - first attempt to get it to work - With a big but ! The most simple thing to do it was to just add - additional type parameters to the respective method. - - Now the type cannot be inferred, which means type-hints must be added. - This should be easy enough, but ... has to be done somehow. - - media-upload doit() methods - It's just a first step, and even though the generation works well, - I am still missing the right Rust code. Will have to simplify ... - - `param()` to set any parameter - That way, things like drive.files.insert alt=media has a chance to work. - We should actually check for this to support various 'alt' values - - added gogole drive API - Just to have another, different set of api information to deal with, - and not accidentally hard-code things to work with youtube only. - - Prepared dealing with media uploads, and it turns out to be best to - adjust the 'doit()' to take the respective type parameter. - - We also have to think about downloads, like the ones for google drive, - which requires custom query parameters. - - ground work for upload media - This might mean we need additional type parameters, but I will see how - it's going to work out. - - In theory, we could define a new trait for Seek+Read, but this would - mean that we couldn't contain owned streams. - - For max flexibility, it's better to have additional type parameters - and use BorrowMut to allow ownership, and borrow. - - request type handling part 1 - Now we will generate proper resoure methods builder calls to instaniate - the more or less valid method builders. - - However, it doesn't compile yet, and the 'to_parts()' method on - resources is still missing. - - build insert/update ... methods - It's just the first version which defaults everything. - Required parameter lists still have to be built. - - It's not going to be a problem at all. - - properties and setters for mbuilder - This includes descriptions, of course, and generally seems to look - quite neat. For now, we brutally consume all input to own it, - but in future we might be able to put in Borrow to support them all. - - infrastructure for method builders - Now comes the actual work of setting them up. - Additionally, the docs were decluttered to show comments only - were necessary. Now the code path to getting the hub is as concise as - possible. 
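The delegate "spike" above settles on dynamic dispatch rather than yet another type parameter. A simplified sketch of what that can look like; the trait, its hooks, and the method name are stand-ins, not the real API.

```rust
// The call keeps an optional `&mut dyn Delegate` instead of a generic parameter.
trait Delegate {
    fn begin(&mut self, method: &str) {
        let _ = method;
    }
    fn http_error(&mut self, status: u16) -> bool {
        // return true to ask for a retry
        let _ = status;
        false
    }
}

struct LoggingDelegate;

impl Delegate for LoggingDelegate {
    fn begin(&mut self, method: &str) {
        println!("starting call: {}", method);
    }
}

struct MethodCall<'a> {
    delegate: Option<&'a mut dyn Delegate>,
}

impl<'a> MethodCall<'a> {
    fn doit(mut self) {
        if let Some(dlg) = self.delegate.as_mut() {
            dlg.begin("videos.list");
        }
        // ... perform the request here ...
    }
}

fn main() {
    let mut dlg = LoggingDelegate;
    MethodCall { delegate: Some(&mut dlg) }.doit();
    MethodCall { delegate: None }.doit();
}
```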
- - Partial MethodBuilder impl - Including documentation at least on the method builder part. The - great thing is that fully working examples are now included on - every type ! - - Now more involved part starts ... namely setting up the individual call - method signatures. - - defs are now more readable - This works with a new `indent` and `unindent` filters respectively. - There are a few things to consider, but I have understood how it works - and can handle it. - There is some overhead just to give me nicer visuals ... might choose - a different route, like annotations. - - generate hub implementation and docs - This includes docs for the library usage. - It's totally great to be able to paste example code right were it - belongs, and also put the same elsewhere to compose more complex docs. - - def for DO NOT EDIT comments - A note like that is now added to all files we generated, commented out - depending on the file type. - - Quite neat, except that for filtering, I always have to use blocks. - - Traits now show up as part of lib - Previously, they were in an extra, oddly named crate. - Now we just make it a part of our generated codebase. - - That way, traits, and common code, shows up as part of the library. - Fair enough. - - This also means that the types ar not reusable. - Maybe a mixed-mode can be used if that is desired. - - add marker traits to schema types - Based on their involvement in activities. - It nearly works perfectly. - - LUTs and context to make better docs - Now a context is passed to utility functions, which contains the state - these may be interested in. This keeps it clean from global state. - - With the lookup tables, it is possible to figure out relations between - types and document them accordingly. - - first generated result ... - ... just to keep track on how it changes over time. - - generating valid rust from schemas - It's very nice, even though there is some more work to be done here. - It's just the beginning ... . - - now sets up entire project structure - That way, we have a common library to pull in from the main repository, - and a space for testing new code (in a partial implementation). - - Next there will be generated object structures. - - improved license information - ... and readme, and looks of author listing. - Slowly getting into the flow, possibilities seem thrilling. - - LICENSE + README.md - Readme is very initial, but the architecture is set to evolve it to - something no less than beatiful. - - mako-render generates output dirs - That way, the makefile doesn't need to know that much anymore, and - gets simpler/less verbose. - - \# Also - * Added filters for rust doc string - * fixed .PHONY - - apis target - make all apis - - can now use custom libraries in pycode - Namespaces can exclusively be used during rendering, which is fine if - you remind yourself of the newline rules. - However, I also need some utiltiies that convert input data. These - are now within their own libraries, which can be used from python blocks - like the ordinary python functions they are. - - Quite neat. - In future, most of the functionality will be in separate namespaces, - the top-level will just assemble the main library file, usnig the - provided %defs. That way, the main file is kept clean. - - cargo.toml template - It's quite final, and super easy to change and to read. - - It seems we want to use namespaces/shared implementations soon to allow - using defs. 
In our case, we transform the version in a particular way, - which is easy enough, yet I'd like to use it to make the system more - powerful. - - generic source/output mappings - This includes proper handling of dependencies. - The code is concise, pythonic and quite 'cody', but does the job just - fine. - - multiple input-outputs per call - That way, we read the data files only once, but produce all the outputs - we need. Together with a powerful makefile, we have a multi-invocation - with proper depedency tracking. - Everything will be regenerated though, even though just a single input - template file changed. - - The alternative would be to have one dependency and invocation per - input dependency, but that will read the entire json each time. - - Let's see what's faster/more useful during development. - - api deps generation works - It's very pleasant to use, and worth the slightly greater effort. - - mako autosetup and improved executable - Now we can write mako templates, with a similar feature set as - pyratemp. Except that its syntax is nicer, allows to do everything - and that there is syntax highlight support. - - Let's see how it fares - - successfully generating make deps - After minor modifications to pyratemp, it certainly does the job. - - What it **does NOT** do: - - * multiple outputs per template/command invocation - * NICE embedding of code (like GSL can) - - It will do the job nonetheless, but mako might be worth a look - - my first gsl program ... - And it crashes on linux and on osx. - What am I doing wrong ? - - unified make based build system - Added all prerequisite programs in binary for easier use. - Make is now implemented top-level, and is not expected to do too much - work actually. It will, however, keep track of all required - gsl invocation and make sure calls are efficient by not having - to rebuild everything every time. That's what make does, anyway ;) - - added authenticator arg - That will allow interaction between client and authentication attempts. - It also shows how cumbersome it is to deal with all these - generics ... but hey, you gotta do what you gotta do. - - If boxes of pointers would be used, it would be easier to handle, but - enforces a certain memory model. That, of course, is not desired. - - makefile for handling json-to-xml - That way, it will remain clearly documented how to do this, and allow - for efficient calling of gsl as well, at some point. - - Of course it will be a little more difficult for us to know all - dependencies, but gsl could generate these as well for us, I suppose. - - add conversion tool and youtube api - The json file needs to be converted to valid XML, which should be - done by a soon-to-be-modified xml2json tool. - - first primitive types and api - Now it should be possible to implement first version of actual - insert handling, with everything there is about it. - - That should eventually help to generalize it, as I am definitely - not going to hand-implemented these protocols ... . - - The great thing is, that if done right, one will be able to truly be - first and make an impact ! - - improved module layout - As there will be plenty of types, it will be better to split it up. - Also learned something about self:: :). - - Insert and and update should be hand-implemented just to see how it's - working. Then there should be some investment to auto-generate this - with `gsl`. Once the latter works ... I could auto-generate all apis, - or adjust the go generator to create rust instead. 
- - Depends on what will be faster ... . - - figure out ownership model - There is a central `YouTube` type which helps constructing various - sub-builders, which in turn provide individual functions. - - Architecturally, it's very similar to the go implementation, but - more efficient memory wise. - - initial commit - Base project with everything it will need to grow: - * CI - * documentation - * basic cargo +* both media-parameters now use `ReadSeek` streams. +* Use `seek()` to figure out size, simplifying the interface. +* reserve_exact(X) where possible (params, multi-part-reader) +* `if let` used whereever possible to prevent duplicate checks +* outer frame of `MultiPartReader` to allow using it in `doit()` +* restructured `doit()` to get content-types right +* Added filters for rust doc string +* fixed .PHONY +* multiple outputs per template/command invocation +* NICE embedding of code (like GSL can) +* CI +* documentation +* basic cargo ### Bug Fixes + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + + - teach remove_json_null_values arrays change `remove_json_null_values()` to properly remove nulls from and recurse in to arrays google_firestore1_beta1's `CommitRequest` contains an array of `Write` objects which can ultimately @@ -1283,964 +1214,137 @@ you need. Summary of changes: - Converted from using span + br tags for formatting to using tables - - Added Bootstrap stylesheet - - Refactored a lot of the logic which was being done in the html ${...} +- Added Bootstrap stylesheet +- Refactored a lot of the logic which was being done in the html ${...} tags out into a block which gets run at the start of each api version. (hopefully this will make the template easier to maintain in the long run) - - Possible issue: - - I swapped from looping over each key in `tc.keys()` to assuming the keys - will only ever be ["api", "cli"]. This hard codes the keys instead of - getting them dynamically, but makes it easier to format as a table and - lets you pull a lot of the logic out of the template and into a single - block before each table row. - - If the types of application in `tc.keys()` ever changes then this - template will need to be updated accordingly! - - use new serde map implementation - No fun, this one. - - build better data - Really just what is needed right now to make it work. - - [skip ci] - - make cli publishing work - It really needs allow-dirty. - Let's hope that won't publish too much. - - try to depend on major version of api - Previously that didn't work due to a bug in carg, - but should work now. - - cli + api use a single base version - That way we get rid of the duplication at least. - Probably it would be enough to just refer to version 1 of the - library respectively, and let semver do the rest. - - correct link to license on github - [skip ci] - - handle discovery urls with $ - Some google discovery URLs contain `$discovery` or other variants, - causing the calls to wget to interpret `$d` as an environment variable - instead of a literal. An example is: - `https://logging.googleapis.com/$discovery/rest?version=v2`. 
- - To fix this, the `$` has been escaped so that wget fetches the URL as - expected. - - Add an unused field to empty API types. - Null structs (struct Foo;) cause the following error when trying to - deserialize an empty JSON object `{}` into them: - - `JsonDecodeError("{}\n", Syntax(InvalidType(Map), 1, 1))` (also known as - `invalid type: map at line 1 column 1: {}`). The optional struct member - prevents this error. - - URL-encoding '/' in URLs is not accepted by Google APIs. - - use redirect flow - The interactive flow requires to paste a code back into the - command-line, which does only work when it's cat'ed, but not - if it is pasted. - - This should let it handle everything internally, which is - way more user-friendly. - - relative path for custom target dir - Using a shared target-dir is important to keep - disk-space usage in check and speed up builds. - - don't fail by default on non-nightly - - use working version of serde-codegen - This update fixes the build on stable, and allows builds - on nightly as usual. - - The trick is to use the latest version of serde-codegen, - which keeps the syntex version internal, preventing clashes - between libraries that might have different requirements. - - as learned from yup-oauth - That way, there is no redudancny anymore. - - work with latest serde - `cargo test` will work now. - We now use the latest serde once again, which should - make everything better. - - remove cargo/config - It seems due to a so far possibly unfiled bug, cargo fails to - get it's CWDs right. - - Last verified with cargo 0.11.0-nightly (42bce5c 2016-05-17). - - To reproduce, just put the deleted file back and run a build command, - such as - - ```bash - make drive3-cli-cargo ARGS=build --no-default-features --features=nightly - ``` - - use hyper Bearer header style - Considering we kind-of hardcoded this authentication type anyway, - we now use the Auth-types provided by hyper 0.8. - - The incentive here was the compiler telling us that there the - yup-oauth::Scheme type doesn't implement the hyper::authorization::Scheme - anymore, even though that clearly was the case. Also it couldn't be - reproduced in yup-oauth itself. - - This will need some work to get correct again, so this is just a crude - patch to make it work again. - - compatibility with serde 0.6 - 0.7 has a weird assertion error that might have happened - if files get too large. - - choose serde-version which works - Everything newer than the ones we see here will cause - the error described in #148. - - use venv-python to run any utility - Previously the yaml version generation could fail if your system-python - didn't have yaml installed. Now the virtual env is used, which is - guaranteed to support yaml. - - use latest oauth2 lib - It enables using std::time::Duration natively - - use new discoveryRestUrl field for json download - - use std::Thread::sleep - However, in sibling libraries, we still use time::Duration, which - now is a part of std::time::Duration. - These should be adjusted, to make the usage of - sleep(Duration::from_millis(d.num_milliseconds() as u64)) into sleep(d) - - improve handling of error code if stable is tested - - get cmn compiling on nightly rust - - - assure license can be generated - - use PYTHONPATH for mako invocation - That way, it will find its resources. - - improve version and library name handling - We can now deal with versions having the 'alpha' or 'beta' suffix. - It's rather hard-coded, but solves the problem for now. 
- - Related to #126 - - update to latest serde/rust - - update to serde 0.5.0 - Serde move all json code into a separate crate that we are now using - as well. - - use clap 1.0.3 - * `SubCommand::new(...)` was renamed to `SubCommand::with_name(...)` +* `SubCommand::new(...)` was renamed to `SubCommand::with_name(...)` which actually is now consistent with everything else (e.g. `Arg::with_name(...)`) - - compatibility with hyper 0.6.4 - * Signature of `client::Response` changed and now requires a +* Signature of `client::Response` changed and now requires a `hyper::Url` as well. - - Closes #123 - - adjust linux script to target dir - Previously it attempted to find build-artifacts in - the 'gen' directory, now these are all found in - 'target', provided cargo 0.3.0 is used. - - [skip ci] - - flush output stream on CLI output - For some reason, this is now a requirement - previously this didn't - seem to be necessary. - - Don't know what changed there ... and it's odd it doesn't flush - when the process is going down or the handle is destroyed. - - work with hyper v0.6.0 - Currently the latter actually fails to link on OSX, and requires a local - override with [this fix](https://goo.gl/OTExmN). - - type-inference fails on empty vec - Previously this wasn't the case, as the type could be inferred by the - type of the parent-vector to extend. - - Apparently this feature was removed, probably for good reason. - - make statement shell compatible - The previous one actually required bash, instead of sh - - add type annotation - It seems to be required when building with an older rustc version. - This did work in nightly, and just seems to be some sort of limiation - in stable. - - work on stable - CLI was slightly adjusted to not use unstable features. - Fortunately, there is no serde magic happening, which allows - us to keep it simple without using a build script. - - minor fixes - * Mime crate must be used in the same version hyper uses - * made attempted move a borrow - - expanded header implementation - Now it compiles to the point where `Mime` appears as duplicate type, - for some reason. - - first big step towards syntex - Even though there is a bug that caues {} to be used in stead of - (), - when exanding macros, which causes syntax errors that we have to - workaround, it's not a real issue. - - What's happening additionally is missing hyper macros, which - now have to be expanded manually. Shouldn't be a problem, - pretty-printing when compiling is made for just that ;). - - No, it's sad that `include!()` works so badly, it makes - using serde so difficult ... it's no fun i must say. - - Just for stable ... I am not sure if it is worth it." - - clean was depending on unknown targets - There are no per-program-type docs clean, just made it depend on - docs-all-clean. - - Also added the `docs-api|cli` target to the generated per-program-type - make help. It was just missing, even though it existed. - - fix clean target for docs/cli - clean-all-docs and clean-all-cli aren't valid targets. The current mako - template causes `make clean` to abend reporting that it can't make these - targets. - - URL substitution handling - Previously we would remove the wrong parameters when attempting to - remove only those parameters that have been used in the URL - substitution. - - The code we have now is more idiomatic and appears to be removing the - correct parameters. - - dc630d01e 2015-05-09 - * Vec::add was removed ... 
which forces me to write 4 lines instead of +* Mime crate must be used in the same version hyper uses +* made attempted move a borrow +* Vec::add was removed ... which forces me to write 4 lines instead of one very readable one :(. Not everything is to the better here, even though I can imagine they did it to prevent people from thinking this is a cheap operation. - - [skip ci] - - deal with rustc lifetime issue - Related to #109 - - limit tar.gz to executable - Previously it could re-pack tar-files and mess everything up. - - [skip ci] - - osx-tar files without directory - Previously, they contained the parent directory, which wasn't intended - and was different from the plain-layout dictated by the linux version - of the script. - - [skip ci] - - filter null values of requrest structs - Some servers, like youtube, reject null values possibly thanks to - the reliance on parts. Now we are filtering them (in a very inefficient, - but working way), which seems to be fine with the servers. - - Effectively, we seem to be able now to upload videos ... . - - More testing required ! - - upgrade to hyper v0.4.0 - It was basically just a find-and-replace to adapt to the changed names - of Error and Result types. - - completed list of parameter names - Previously the 'did-you-mean' functionality only knew the global - paramters, but not the method-local ones. - - simplified call to form_urlencode - It now supports more generic inputs, as suggested in a lenghty - dialog on a corresponding github issue. - - Required to build with >=0.2.33 - - added latest reference CLI code - Just to have something to link to - - gate usage of `upload_media_params` - Previously the local stack variable would be used even though it - wasn't initialized as there were no upload flags. Now this only - happens if there are media params. - - [skip ci] - - let delegate forget uploaded urls - When uploading using the resumable protocol, we are now telling the - delegate to forget the previously stored URL after successful upload. - Previously it would have tried to return such a URL and thus made - the system retry uploading a file that was already uploaded. - - handle repeated required strings - In a single case we wouldn't properly pass on string arguments that - were repeated. Now we handle them with a nice one-liner. - - 'about()' text for main commands - It shows up in the help, and makes it easier to navigate the command - tree without bringing up the html documentation. - - adjust `JsonTokenStorage` to yup-oauth - Signature of `set()` changed to return a `Result<(), _>` instead of - an `Option<_>`. - - Related to https://github.com/Byron/yup-oauth2/issues/5 - [skip ci] - - unified error handling - * Use `Result` everywhere, instead of Option or tuples - * Properly handle error occurring after the dry-run. We do it in an +* Use `Result` everywhere, instead of Option or tuples +* Properly handle error occurring after the dry-run. We do it in an extensible way, in case we need to do more than handle invalid output files at some point. Output files that could not be opened will now result in a nice error message with all the information we have. - - escape subcommand descriptions - Otherwise, we could have had invalid rust strings. 
- - [skip ci] - - remove unused std_misc feature - Hopefully this will not trigger errors elsewhere, but we will - just find out I guess ;) - - adjust to latest hyper header macros - - re-introduce UploadProtocol,fix CallType - * CallType now represents either Upload or Standard calls, whereas +* CallType now represents either Upload or Standard calls, whereas the Upload variant is represented by the UploadProtocol enum. That way it's clear what happens, and we don't mix orthogonal concepts in one enumeration just for convenience. - - All tested APIs seem to build, verified - - * upload - * download - * request structures - * parameters - * scopes - * config-dir - * debug[-auth] - - update docs and fix calltype handling - * mkdoc docs grammar is now hierarchical, making the command structure +* upload +* download +* request structures +* parameters +* scopes +* config-dir +* debug[-auth] +* mkdoc docs grammar is now hierarchical, making the command structure more obvious and easier to understand. It's a nice addition to the auto-generated, hierachical usage of clap. - * UploadProtocol enum is now CallType, to ease handling the different +* UploadProtocol enum is now CallType, to ease handling the different ways the Call has to be executed. It looks quite clean, even though combining upload protocols and the calltype is a bit hacky. - - various fixes and improvements - * `--version` now includes the API revision we embody +* `--version` now includes the API revision we embody (using crate_version()) - * Allow multiple scopes to be specified, instead of just one. Previously +* Allow multiple scopes to be specified, instead of just one. Previously this was problemantic due to argument parsing of docopt being greedy. However, this also means we have to specify the `-r` flag for each invocation. See https://github.com/kbknapp/clap-rs/issues/89 . - * Adapted to new signature of `Arg::possible_values()` and used the +* Adapted to new signature of `Arg::possible_values()` and used the previously orphaned `UploadProtocol` enum. - * Deduplicated code a little by adding the new `opt_values()` generator +* Deduplicated code a little by adding the new `opt_values()` generator function. - - Related to #81 - - print usage if command is missing - Also, fixed config-dir substitution in flag's help message - - tweaks to make youtube3 work - Mainly minor cleanup, and handling of generator branches that - didn't show up in smaller APIs that were used during the first steps. - - related to #81 - - adjust option usage to changed API - Discovery API now builds and seems to work even ! More testing - will have to be done though to be sure. - - handle apis without media upload - We are annotating the type of the optional protocols if that shall be - required. - - call `iter()` directly - As IntoIter is only implemented for slices up a sice of 32. - DFAReporting though will reach 55, at least. - - Also added dfareporting-cli code to show how stackoverflow issues can be - circumvented efficiently. - - commit before un-using UploadProtocol - We will try to wait for https://github.com/kbknapp/clap-rs/issues/87 - to allow us to use the enumeration instead of strings, as well as - an iterator, which will look more idiomatic in the end. - - generate command data structure - We do this in the hopes to circumvent a stack overflow. - This means we will setup the parser entirely at runtime, which actually - saves a little bit of code. 
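One way to model the CallType/UploadProtocol relationship described above; this is a sketch, not the CLI's actual definitions, and the `uploadType` values shown are only examples.

```rust
// Standard calls carry no upload protocol; uploads carry exactly one.
#[derive(Debug)]
enum UploadProtocol {
    Simple,
    Resumable,
}

#[derive(Debug)]
enum CallType {
    Upload(UploadProtocol),
    Standard,
}

fn upload_type_param(call: &CallType) -> Option<&'static str> {
    // The `uploadType` query parameter is only relevant for uploads; the
    // simple case may be "media" or "multipart" depending on the method.
    match call {
        CallType::Upload(UploadProtocol::Simple) => Some("multipart"),
        CallType::Upload(UploadProtocol::Resumable) => Some("resumable"),
        CallType::Standard => None,
    }
}

fn main() {
    let call = CallType::Upload(UploadProtocol::Resumable);
    println!("{:?} -> {:?}", call, upload_type_param(&call));
}
```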
- - upload some code to help debugging - We get a stack-overflow when trying to run the dfa-reporting program, - and right now I don't know how to workaround it. - - This could be preventing us from using clap. - - make it work with latest hyper - This is known to work with the master of hyper. It's probably OK - to keep it, preparing for the next release and under the assupmtion - that I will not be releasing binaries for a while. - - exclude cloudsearch from build - It doesn't have a single method, and thus is useless - - code updated to v0.1.6, latest CLI - * also includes publishing tag files - - CLI + API release preps - - update changed `url` crate imports - - request value cursor handling and docs - * now the cursor will only be set permanently if the -r flag is used in +* also includes publishing tag files +* now the cursor will only be set permanently if the -r flag is used in 'cursor' mode. In 'cursor=value' mode, the cursor change doesn't persist among the flags. That way, one can easily distinguish between setting the cursor, and setting a field. However, '...sublevel.level=value' will still work as it did previously, yet the cursor change will not persist. - * Documentation was adjusted to represent the new cursor style. - - simple and resumable upload works - * fixed boundary syntax of multi-part message. Was --BOUNDARY, now is +* Documentation was adjusted to represent the new cursor style. +* fixed boundary syntax of multi-part message. Was --BOUNDARY, now is --BOUNDARY-- - * Fixed ContentRange parsing and serialization. We actually managed +* Fixed ContentRange parsing and serialization. We actually managed to break it last time we tried to update it to match the Go implementation. - * fixed uploadType header parameter. It's based on chosen protocol and +* fixed uploadType header parameter. It's based on chosen protocol and whether or not the method supports multipart operation for the given protocol. - - Related to #76 - - use only one request structure - This works as we can just put all request-structure parsing to the top - of the function. - That way, we don't put the request struture twice. - - set request value to call - Previously, even though the request was passed by reference, it was - copied and thus our changes never arrived in the call. - - Now the API makes this clear by taking ownership, and the CLI code - sets the Request value lateron, explicitly. - - Related to #76 - - verified download works - * implement custom scopes - previously they could be set, but were +* implement custom scopes - previously they could be set, but were ignored during the API call - * api-overrides are not yaml files for convenience. Existing ones were +* api-overrides are not yaml files for convenience. Existing ones were updated as needed. - - update all code to latest version - * add new APIs - * remove old ones - * add latest json files - - response value json decoding - * updated all json API descriptions - * enabled 'pretty' printing of response structures. However, currently +* add new APIs +* remove old ones +* add latest json files +* updated all json API descriptions +* enabled 'pretty' printing of response structures. However, currently there is no way to get rid of all the NULL fields without external filtering - * all structure fields are now optional - there seems to be no way +* all structure fields are now optional - there seems to be no way around it. - - implement deletion of tokens - Previously this case was entirely uncovered. 
- Interesting note: when a token is revoked, existing auth-tokens will - still work. However, you may not refresh them in case permissions - have been revoked. It's good as there is only one code-path to deal - with (and we verified it to be working), and bad for the user as - malicious software can keep using an account for certain time until - the token expires. - - adapt to changed yup-oauth2 API - The latter changed a lot, to the better, and we handle the new - return types accordingly. - - Related to #74 - - resolve generator issues - * exclude dataflow API - it doesn't have a single method as long as +* exclude dataflow API - it doesn't have a single method as long as it's in B4. See https://github.com/Byron/google-apis-rs/issues/78 - * assure ARRAY branch can be hit - - update make target - Also, generate CLI. Probably there is not enough time to build it. - - README info + fix author email - Please note that docker build script is still in debug mode, this - issue will remind me about it: #72 - - scopes were used illegally - Which caused a compile error. This was fixed by assuring the code - uses the same function to determine whether or not scopes are present - per method. - - [skip ci] - - (abf0548b5 2015-04-15) (built 2015-04-15) - - latest version of all APIs - Now CLI and API and the same level - - request value parsing compiles and inits - Therefore, you can now actually make the call and the program will not - crash due to uninitialized Options. - - struct access compiles ... - ... but currently wouldn't run as we don't initialize the optional sub- - structures at all. - - corrected cursor handling in mkdocs - The trick was to use an actual list of cursor tokens that is consumed - on use. That way, we don't loose track of were we are in the - structure. - - Related to #64 - - NULL default values instead of randoms - Instead of generating nonesense random values, we just map defaults - that represent the respective NULL value of a given type. - - alt-media handling in CLI+API-docs - * API-docs now adjust depending on where 'alt' is set (either as global +* assure ARRAY branch can be hit +* API-docs now adjust depending on where 'alt' is set (either as global parameter, or as method-parameter) - * CLI: download tracking now works for 'alt' as method-parameter - * CLI: global parameter remapping allows them to be named consistently, +* CLI: download tracking now works for 'alt' as method-parameter +* CLI: global parameter remapping allows them to be named consistently, but map to the name required by the google API. - - optional parameter default handling - Now we provide a matching default for each paramter, thus alleviating - the previous issue of unncecessary follow-up errors. - - add rustc_serialize to test-crate - A top-level `cargo test` didn't work anymore thanks to a missing - mention of rustc_serialize. - - [skip ci] - - optimze argument handling and conversion - * Thanks to a generic function, we save a lot of code within main.rs - * more effcient signature for ParseError - - Display for Errors + refactor - * refactored errors into a hierarchy - * implemented `Display` trait for all error types, including some +* Thanks to a generic function, we save a lot of code within main.rs +* more effcient signature for ParseError +* refactored errors into a hierarchy +* implemented `Display` trait for all error types, including some 'hierarchy-aware' printing. 
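The "hierarchy-aware" `Display` printing mentioned in the error-refactoring entry above can be sketched roughly as follows; the error variants are invented for illustration and are not the CLI's actual types:

```rust
use std::fmt;
use std::io;

// Illustrative error hierarchy: field-level problems are wrapped in a
// CLI-level error, and each layer adds its own context when printed.
#[derive(Debug)]
enum FieldError {
    Unknown(String),
    BadValue { field: String, value: String },
}

#[derive(Debug)]
enum CliError {
    Field(FieldError),
    Io(io::Error),
}

impl fmt::Display for FieldError {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        match self {
            FieldError::Unknown(name) => write!(f, "unknown field '{}'", name),
            FieldError::BadValue { field, value } => {
                write!(f, "value '{}' is invalid for field '{}'", value, field)
            }
        }
    }
}

impl fmt::Display for CliError {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        match self {
            // Delegate to the nested error so every level prints its part.
            CliError::Field(err) => write!(f, "request argument error: {}", err),
            CliError::Io(err) => write!(f, "io error: {}", err),
        }
    }
}

fn main() {
    let errors = vec![
        CliError::Field(FieldError::Unknown("snipet".to_string())),
        CliError::Field(FieldError::BadValue {
            field: "max-results".to_string(),
            value: "many".to_string(),
        }),
        CliError::Io(io::Error::new(io::ErrorKind::NotFound, "client secret missing")),
    ];
    for err in &errors {
        println!("{}", err);
    }
}
```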
- - improved scope handling; fix CLI - * in APIs, scopes will now be per-method, and if no scope is given, +* in APIs, scopes will now be per-method, and if no scope is given, we will assume only the API key has to be set. Previously there was a wild mix between globally mentioned scopes and method scopes. - * assure CLI generation works so far, for all avaialable APIs - - Related to #48 - - add commands.yml.mako - It was previously hidden thanks to .gitignore. - - Good I made a fresh clone to see if make really really works. - - dependencies are now per-program-type - Previously we put cli.py into the common lib folder, which caused the - API to be regenerated and rebuilt whenever we changed code that will - only affect the CLI, causing terrible turnaround times. - - Now the dependency is fixed. - - 'bytes ...' -> 'bytes=...' - * update all APIs to contain said change. It's not worth a republish +* assure CLI generation works so far, for all avaialable APIs +* update all APIs to contain said change. It's not worth a republish though. - - better subtext + rename target - * catchier title for dev diary episode 1 - * fixed target name for clean, which was 'clean-api', but should have +* catchier title for dev diary episode 1 +* fixed target name for clean, which was 'clean-api', but should have been 'clean-all-api' - - one folder per API docs - Otherwise, it would overwrite its search index, effectively breaking - the search field. - - We might run into space issues on github, as the generated docs are - duplicating each other and use a lot of disk-space. - - use bytes=... when sending as well - Previously, `bytes=` was just parsed, but not sent to the server. - This change is motivated by a similar change in this commit: - http://goo.gl/AvyvLb - - fix dependencies - That way, we don't build documentation unless this is truly necessary - - add publish state v0.1.5 - - corrected absolute links - This only affected links in readme files, not the relative ones - in the actual documentation - - v0.1.5 - * fix documentation link in Cargo.toml - * adjust to latest hyper. It's not even out yet, but people +* fix documentation link in Cargo.toml +* adjust to latest hyper. It's not even out yet, but people can't build the APIs anyway. - - adjust to hyper client - * deal with hyper client not using a type-parameter anymore - * fix incorrect documentation link (use '_' instead of '-') - - v0.1.4 - * added crate publish tag files - - adjust invalid make target - * `docs` is `docs-all` now. On travis, this should only build one API - - v0.1.4 - * macro 'alias' was renamed to 'rename' - * fixed `cargo test` on main project - - The latter pointed me to the serde issue, which would have made - everything fail when actually used to communicate with google servers. - - v0.1.3 - * keywords are no longer than 20 characters, which is a restriction +* deal with hyper client not using a type-parameter anymore +* fix incorrect documentation link (use '_' instead of '-') +* added crate publish tag files +* `docs` is `docs-all` now. 
On travis, this should only build one API +* macro 'alias' was renamed to 'rename' +* fixed `cargo test` on main project +* keywords are no longer than 20 characters, which is a restriction cargo imposes - * don't use 'homepage' link in cargo.toml unless the homepage is +* don't use 'homepage' link in cargo.toml unless the homepage is non-empty - * Added all publish-results to mark the respective crate version - - Related to #46 - - version 0.1.3 - * builds with latest beta/nightly - - rustc (be9bd7c93 2015-04-05) - * using std::convert - * update to latest hyper (and other dependencies) - - Related to #46 - - github-pages index generation - Previously, we forgot to pull in the new type-specific dataset, which - caused the index.html.mako file to fail. - - check-in of latest sources - This also includes crate files to remember which - crates we have published already. - - Related to #44 - - set the API version to 0.1.2 - - incl. `Result` conform to standards - Related to #44 - - remove newlines interpreted as test - When documenting mandatory arguments of a methods builder, it was - possible to document 'parts', which have a long multi-line description - with empty lines inbetween. This caused the indentation to be funny - and tricked rustdoc into thinking these are indented doc-tests. - - Now we remove these empty lines, also hoping we dont encounter lines - with just whitespace in them. The latter would require a tiny update - of our regex. - - remove custom Result Enum - Instead, we just use a customized `Result` tyoe and thus stick to - common Rust conventions. - - update json files from discovery API - - typo fixes and misc. improvements - - - whitespace and trait rename - * `ResourceMethodsBuilder` -> `MethodsBuilder`. This is now precise +* Added all publish-results to mark the respective crate version +* builds with latest beta/nightly +* using std::convert +* update to latest hyper (and other dependencies) +* `ResourceMethodsBuilder` -> `MethodsBuilder`. This is now precise enough. Previously it was just to similar to what's now a `CallBuilder` - * Fixed whitespace issue in `doit()` - - upload size now taken properly - Previously, it would query the size from the wrong dict and obtain - the value 0 all the time. This would have made every upload fail with - `UploadSizeLimitExeeded`. - Now we obtain the actual size limit, and will ignore it if unset/0 - for some reason. - - Patch += 1 - - 0.1.0 release - * Added all APIs to source control - * upped crate version - - upload() return value handling - Now deals with Cancellation and non-OK status codes correctly. - - re-export types used by delegate - Otherwise, delegate implementation might not actually be possible. - - better introduction and version handling - Make it cristal clear what the crate version means, and what version of - the documentation you are looking at. Also do this in the README file. - - Assure that 'Google' is capitalized. - - repository/source-code link - Previously it pointed to a timestamp file. Unified repository - source code link generation, and simplified 'deps.mako'. - - Related to #38 - - simplification and cleanup - * renamed `*MethodsBuilder` type to `*Methods` type - * renamed `*CallBuilder` type to `*Call` type - * greatly simplified `doit()` signature if uploads are involved - * pass `auth` to upload helper - - schema_markers() accessed map incorrectly - - prune unused and ToParts trait - * do not emit unused types. 
Sometimes though, rustc doesn't seem to +* Fixed whitespace issue in `doit()` +* Added all APIs to source control +* upped crate version +* renamed `*MethodsBuilder` type to `*Methods` type +* renamed `*CallBuilder` type to `*Call` type +* greatly simplified `doit()` signature if uploads are involved +* pass `auth` to upload helper +* do not emit unused types. Sometimes though, rustc doesn't seem to detect that attributses are actually used - * ToParts trait is used and implemented only when needed. - - Linters are back to 'normal'. - - pretty names for methods and resources - Previously, it was possible for methods to have '.' which showed up - in the documentation. Now these are replaced with ' '. - - exclude those with recursive schemas - They currently don't compile as Box 'serde' is not supported. - See https://github.com/erickt/rust-serde/issues/45. - - Related to #34. - - make recursive types possible - Must be `Option>` now, whereas a simple `Box` worked - previously. Anyway, serde can't decode/encode Boxes yet, so - plus1 was removed from the list of APIs to build. - - Related to #34 - - MethodBuilder -> CallBuilder - Find-and-replace. It seems to build and work correctly, still - - improved markdown for library overview - And names of free methods, which previously contained '.'. These are - now spaces. - - just add latest youtube code - It's good to see what actually changed in the json realm. - - Vec/HashMap are Optionals - That assures that we can decode partial server responses, and send - partial structures as well. - - serde cleanup;JsonError pub fields - - prevent type-clash with `Result` - This should have been fixed in previous commit, but wasn't. - Actually a change that fixed one API, broke the other, and vice-versa. - - It's kind of a hack, because it's tailored to fix particular cases only, - yet I believe it's contained enough to say 'good enough'. - - some links pointed to old doc name - With one of the recent changes, the crate name was changed to be - different from the library name. However, there were still plenty of - places that would refer to the library name instead of the new crate - name. - - That way, links in the README.md as well as index/index.html still - pointed to the old location. - - MultiPartReader test case - Simple fixes, required as its API changed - - MultiPartReader now works correctly - The state-handling was incorrect, causing it to not handle small reads - correctly. - However, this is working nicely now. - - fix lifetime issues - Those were totally real, actually, and I am happy the borrow checker - exists ! - - Only one weirdness happened with RefCell>, but it could be - fixed by checking actual types using `let foo: () = something_nasty()`. - - repeated params string addition - It seems odd that String + &str is required. - In any way, previously it would try to add i32 to a string. - - repeated parameters docs improvement - Previously, it said it would 'set' the property, which is not the case - after all. - - regenerate .api.deps less often - It took too long to do it, so the 'MAKO_LIB_FILES' dependency was - removed. It can be re-added if needed. - - decent solution for free methods - Now I just add a 'virtual' resource, which is called 'methods'. - The good thing about this is that it works well standalone, or - in conjunction with actual resources. 
- - Also the system works with it just with minimal changes - - unit-tests work once again - Added missing Result cases to make match exhaustive - - remove BorrowMut until it's cleared - See stackoverflow at http://goo.gl/f27zJkj. - - Now we can actually call out client and move on with handling the result - - user lower-case library names,always - - - force python2.7 in virtualenv - force the usage of python2.7 on systems where /usr/bin/python points to python3.x - - fixes issue #12 - - incorrectly capitalized cargo.toml - This caused cargo on a case-sensitive file-system not to find the - cargo file, which made it to look upwards in the directory structure - to find the correctly named Cargo.toml fo the 'cmn' development - project. - - explicit subshell for cargo-doc - Previously, it was only executing for cargo $(ARGS) - - try using a subshell for cargo cmd - Apparently travis doesn't execute cargo in the right sub-directory. - Might be a difference in the way make works - - Related to #8 - - fixed dependency to wrong target - Which caused the cmn.rs to be missing, and the build to fail. - - install virtualenv automatically - The only dependency we really have is python, and wget. - Pip is not needed ! - - fully qualified activity names - - - Do not generate docs ! - Previously, travis would continuously overwrite my combined docs with - the ones from the dev-project, and make them useless. - - This has been driving me nuts ! Good to have it fixed ! - - added milestone link - It's important to the project, and should thus be listed there - - use function to make links correctly - It will automatically handle rust-doc mode, and use relative or absolute - links respectively. - - assured it handles '0' correctly - Previously, it could possibly take a '0' away from the start of a - version. Now this is definitely not possible anymore. - - make 'regen-apis' work - Thanks to changes in mako libraries, it won't work anymore without - the template directory set - - typo - - fix incorrect nested type names - There was a name-duplication which led to un-inmplemented types. - - The good thing is that this was the last issue that kept all 72 - APIs from compiling. - - finally, we pick up all types - HashMap types were missing previously, but now it seems to be picked - up quite nicely. - Would this mean we do the type-setup correctly, everywhere ? - - transitive, minimal traits for types - Previously, I would just assign all useful traits to all types, no - matter on how they were actually used. - Now it builds all dependnecies and considers them when assigning - traits, which is as precise as we need it. - - This is important to us as the `Json` type is just encodable, but - not decodable. Fortunately, we just have to encode it, but in theory - this makes it hard to embed any json in a known structure. - - no unused types anymore - Due to shared global state which was altered, we got wrong results. - This is fixed now, thanks to a deepcopy. Amazing, how altering global - state is always biting you, even though you are convinced it's safe - to do in just this case ! - General rule: Just don't do it, no matter what ! - - improved camelCasing - Previously, it was possible to get types like Foo_bar, which is not - desireable. - Now it is totally impossible to see such blasphemy ;) - - protect from nested-type-clash - It was possible for a nested type to be generated with a name that in - fact CLASHED with an existing schema type. What are the odds ! 
- - The clash-check added will just verify against clashes with schema - types, which seems to be doing it for now. - - nested type names are consistent now - At least so it appears. - The implementation doesn't look totally clean to me, as it seems - similar concerns are in different portions of the code, which was - merely tuned to work together. - - It could break appart if someone - me - wants to change it sometime - - scope -> add_scope - This is not only more precisely saying what it does, but also doesn't - clash with scope parameters on resources ;) (happened in dfareporting) - - improved nested array type handling - It needs deduplication though, coming up next - - prevent struct recursion issue - This works by just boxing types which are nested within themselves, - breaking the recursion. - - nicer code and identifiers - - nested types work for arrays - Thanks to removed code which made no sense to me, I put in a bug. - Now the code is back, beta than ever, and documented as well :). - - now deals with non-objects - These are arrays or HashMaps, which are nested types too. This is used - to have custom types of standard vectors or hashmaps, which resolve - to NewTypes in Rust. - - optionals are working once again - A bug was introduced which caused nested-types not to be optional - in situations were they should. - - nested type resolution and hashes - It seems we do it better than the actual Go implementation, which fails - to detect that scopes are actually having a string member. - - However, there still is an issue, as it's as hashmap for us, but just - a member for go ... lets see ... - https://developers.google.com/discovery/v1/reference/apis#resource - shows that we implement it correctly :) !! - - remove compiler warnings. - Also, a build issue was fixed when schemas were no objects. - However, I think I will have to check this one anyway - - no compiler warnings - This involves disabling the dead-code lint, which is just to ease - debugging, currently there is a lot of dead code as 'hub' is never used. - - Soon, this will change, so the lint will be enabled again. - - deepcopy dicts instead - It was possible for writes to happen in nested dicts, causing global - data to change and confuse the system. - Not that I wouldn't be aware of that danger, but apparently I didn't - see the recursiveness of the call tree :). - - fixes to help more projects to build - Involving - * complete list of reserved words (keywords in Rust) - * use namespace for otherwise clashing types in cmn::, io:: - - fix name clashes - Scopes could be invalid, previosly, and the hub type could clash - with other types provided as Schema. - - Also, we used reserved identifiers - - deal with missing auth information - Now all APIs can be built successfully, which should help to - prevent things from getting hardcoded in any way. - - resource-to-category map - It allows to obtain category, which we previously dropped - - do not degenerate during activity_split - First step, next one will actually be keeping that data ... - - asssure candidate is in mapping - It seems nearly nothing can be taken for granted ;). - It's best to just run against a big set of APIs and fix issues as they - arise though. - - More flexibility means more maintenance, after all. - - intermediate improvements ... - ... it shows that the override I used previously won't work for `admin`. - Therefore we have to keep the actual value, instead of degenrating it. - - Makes sense ... 
it's interesting how much one tends to hard-code things - to work just for a few cases, unless you opt in to see the whole picture - - ignore beta/alpha,assure latest - There were a few bugs in the generator program, which caused old - versions to be picked up, and alphas/betas - - now with flattened activities - That way, we don't have recursive method builders, but instead - flatten the hierarchy. In our case, this is easier and means - less to type. E.g. - - `hub.user().message().import(...)` now is - `hub.user().message_import(...)` - - In go it would look like this though - `hub.user.messages.import(...) - which is neater. - - We will never have that though, as it will initialize a massive amount - of data right on the stack, even though only some of it is ever used - ... . - - first recursive resource support - However, this also means we need recursive builders, which is tottally - unsupported for now ... . - - This means we have to generalize rbuild generation ... could be easy. - Lets see - - make scope gen work with gmail - - scopes are sorted Strings now - That way, we make retrieved tokens independent of the order scopes - were passed in. Additionally, we can pass any scopes, just in case - someone uses one token for multiple APIs. - Let's keep it flexible. - - Manual scope parameter ... - ... however, it should better be a set, and there must be a way to - control certain global names using the configuration :) - - it now works in every which way - Custom property annotations have been very useful, to steer very special - cases. - - It's also good that now there is enough infrastructure to deal with - any amount of additional type parameters. - - added size and mime type support - This information must be provided, I just forgot it - - doit() call with enum type annotation - It's syntax I never used before, but ... works ! - Now lets try to get the BorrowMut back - - recursion for nested types - Drive has recursive nested types, which were not handled preeviously. - - examples section in mbuilder got lost - - filter request value props by parts - Previously, it would just show all parts. - It's still not correct though as this isn't necessarily the parts used - in the request value, but only the ones in the response value. - - It's as good as it gets though, that's all the information contained - in the json. - - method builder examples work now - It was easier than expected, but in the end needs quite some custom - code and handling. Good to have mako (and python !!!) - - have to handle required/optionals vals - Of course, it's ok to do that, but ... of course it doesn't make things - easier. However, I want these examples to be representing the real thing - - remove empty '/// # ' lines - They seem to make cargo fail to build valid doctests. Might be worth - a ticket ! - - fixed part handling,it compiles now - What's missing is docs, which will see some work now. - I guess it will be best to hide all the prelude from the user, to allow - him to focus on what's important here. - - setters now copy copyables - Previously, they would take everything as reference and clone - everything unconditionally. Now we do it only as we need to do it, - no extra work incurred. - - using visual markers now - Makes everything evaluate faster, and is good enough as well. - Besides, you don't have to think about whitespace too much, keeping - things simpler is usually better - - perfected trait recognition. - However, they are not listed as traits of the youtube api. 
What we - really want is to list common implementation types as part of ourselves. - - This doesn't work though as long as we don't have the common impl - as part of our sources. - - now the map is complete - It's quite nice - next up is marker traits ! - - dependency handling:dirs with timestamp - That way, make will not regenerate unnecessarily - - make all pods optionals. - That way, json conversions will always work, which is probably what - we desire (especially when handling server answers). - - now docs look good too - - unify generated constants - like library-name. That way, they are always the same, even if I - change my mind. - - Good coding style is easy, using the current setup. - - mv youtube-rs to google-apis-rs - - handle whitespace and add GENINFO - Also, remove obsolete pyra files - - is now self-contained - A little more than the promised 500 lines of code though ;). - - removed gsl, added pyratemp - As GSL failed in my first attempt to get the example program going, - it might be better to try something else before too much time is spend. - - Fortunately, pyratemp **seems** to be something usable, and even if not, - it might be possible to make it usable as it's just a 'simple' - python script that I might be able to understand, if need be. - - fixed dependencies - The make deps generator should only care about the shared xml - - forgot to add shared.xml - As XML files are ignored, I didn't see that. - - works exactly as needed. - Producing non-malformed pretty xml - - xml.tostring works now ... - ... but it still generates invalid output due to scopes. - Should be an easy fix - - make it handle top-level keys - It can now handle multiple of them. - - However, conversion fails, as the bloody xml converter can't handle - booleans ??? WTF - - make sure we get correct openssl vers. +* ToParts trait is used and implemented only when needed. +* complete list of reserved words (keywords in Rust) +* use namespace for otherwise clashing types in cmn::, io:: ### Other @@ -2453,7 +1557,7 @@ you need. - - 2863 commits contributed to the release over the course of 2766 calendar days. + - 2864 commits contributed to the release over the course of 2766 calendar days. - 642 commits were understood as [conventional](https://www.conventionalcommits.org). - 7 unique issues were worked on: [#173](https://github.com/Byron/google-apis-rs/issues/173), [#269](https://github.com/Byron/google-apis-rs/issues/269), [#271](https://github.com/Byron/google-apis-rs/issues/271), [#281](https://github.com/Byron/google-apis-rs/issues/281), [#296](https://github.com/Byron/google-apis-rs/issues/296), [#328](https://github.com/Byron/google-apis-rs/issues/328), [#357](https://github.com/Byron/google-apis-rs/issues/357) @@ -2479,6 +1583,7 @@ you need. 
* **[#357](https://github.com/Byron/google-apis-rs/issues/357)** - regenerate all APIs and CLIs ([`7abe6a3`](https://github.com/Byron/google-apis-rs/commit/7abe6a3de2c3de713d9d4754c880897f85a84847)) * **Uncategorized** + - prepare changelog ([`ecb10a2`](https://github.com/Byron/google-apis-rs/commit/ecb10a2ff500a1d10add9be336393a10e51ec050)) - Merge branch 'common-crate' ([`96b3d72`](https://github.com/Byron/google-apis-rs/commit/96b3d728a3b3d76c64fa0e48198d09b2d3c023bd)) - prepare google-apis-common for release ([`716c4c2`](https://github.com/Byron/google-apis-rs/commit/716c4c263a278c334feacf57c3eabbed09251a9e)) - rename `google-api-client` to `google-apis-common` ([`8d7309b`](https://github.com/Byron/google-apis-rs/commit/8d7309b78c3bc909b794d447115328cfb0f41649)) @@ -5336,3 +4441,7 @@ you need. - initial commit ([`dda8476`](https://github.com/Byron/google-apis-rs/commit/dda847607fc88ab6bb6d9646d52cd9795f7af0b3)) + +Should have been fixed alongside of #81 visual gap between cursor and kvPreviously, the space was barely visible, confusing even myself :).Now it’s clear, using 4 spaces, that there is a cursor invocationfollowed by a key-value pair. add link to general documentation[skip ci] request values are moved, not borrowed[skip ci] filled README.mdAll possible documentation was added in a quality sufficient fora first release. After all, everything there is is documented. integrate different program types random values + cursor information absolute top-level cursor + details relative cursor positioningIt would still be nice though to show absolute positioning as well. dynamic absolute cursor position exampleWe build all required -r flags using absolute cursor positions only.The next step should be to use relative ones, and of course be moreverbose about how this should be interpreted (sequential). upload and output flagWe are already there, except for documenting the request value type,which definitely deserves a separate issue. optional paramtersAdded documentation for flags setting all kinds of optional parameters. inforamtion about setting structsFor now we just have a ‘dum’ example, but once we are there, we shallmake the example and documentation based on the actual request value.This requires some additional work, which fortunately has to be donein python only. add required scalar arguments name default scope in API docs added CLI scope documentationIn addition to that, they can now be set as well.Unified generation of the ‘default’ scope. update to include CLI targets minor phrasing changesAlso removed superfluous ‘extern’ for tests deal with ‘virtual’ methods resourceWe assure to know about it, instead of writing nonsense about that‘methods’ resources which does not actually exist.I am relatively sure to have found all the spots. method features and general info add build instructionsThese should help people to get started on their own. initial versionIt’s still rather simple, but a basis for further improvements result handling and remaining todosBasically there is no todo left, which puts us in a good position forimplementing more features, and get some feedback in the meanwhile. bigger font for doc-index for additional parametersBased on the parameters suitable for the entire API. One could alsomake them available in the builder … . cross linking of resources/activitiesThis makes it so much easier to get to the example call you areinterested in.It’s getting there, slowly ;) docs for terms.upload methodsAlso fs::File is now used with prefix, to prevent clashes. 
scope docs for method builders fixed spacingAlso, the do() implementation was moved into it’s own def, eventhough it’s still quite empty. improved spacing added info about settable partsIt’s not as good as the parts info on the website, but it’s something !At least people don’t have to read the text, but find this informationin all the spots that are relevant to this. more information, nicer visuals method builder call exampleWith nearly fully randomized examples to show how it can be done.It’s quite nice to see actual calls, using everything required to geta call. The only thing the user has to manage is to fill in actualvalues.But, it also shows that our builder pattern doesn’t work yet due to …you guessed it … lifetime issues :D library overview as far as possibleEverything we have, feature wise, is now documented in a first versionat least.We shall keep this uptodate with what we are implementing, which alsohelps figuring out a good api. did you mean for struct values -u parsing adjust to serde usage in yup-oauth implement -u as good as possibleWe can’t have the -u style yet, buthttps://github.com/kbknapp/clap-rs/issues/88 might help with thatat some point.Related to #92 and #81 parse structure and build AppWe are currently setting everything up at runtime, and manage to getnearly all information into it, except for the more complex-u (simple|resumable) flag. initial version of command generationIt compiles and works, even though there are many things we want toimprove.One big question is how to define multi-arguments, like -u foo bar baz. setup infrastructureThis allows us to setup clap and see if it compiles, which is the primegoal of the current workflow step.Related to #81 simple linux deployment scriptIt’s made for a linux machine, not for docker simple osx deploy script improved error handlingWe are now able to decode detailed errors and pass them on. This allowsthe CLI to provide more useful error responses.Additionally, the CLI will only print debug responses in –debug mode. per-API-credentials with defaultThat way, we can provide better service, as CLIs that consume a lot ofquota can easily have their own app credentials, and with it, theirown quota.The fallback will be a project that allows to use all possiblegoogle APIs.The user can always put in his own application secret to use his ownquota or even paid services. hashmap handling repeated required argsRelated to #77 –debug-auth flag –debug flag to output traffixNice, we are totally ready to test and fix all API features.Related to #70 added first versions of all CLIThat way, changes can be tracked.Also, we make it official.Future checkins will only be made if major changes were done,similar to how the APIs are handled.Related to #64 struct value parsingThis works already for simple request values, but doens’t generatecompiling code for structures with Parts in them.Nonetheless, it’s a big step towards finishing the overall issue.Related to #64 field cursor complete and untestedTests just need to be run, and of course, the impementation might needfixing.Related to #64 make respective uppload_callNow we actually provide the information required to upload data in asimple or resumable fashion. upload flag parsingWe handle errors gracefully with costum types and minimal amount ofcode. 
Unfortunately, Mime type parsing is very ‘flexible’, allowingnonesense types to be passed easily.Related to #62 global optional parameters+DL trackingRelated to #61 parse method parameters and set themIt’s implemented in a working fashion, except that the default valueis not currently set to something sensible, causing duplicate errors incase the key-value syntax is wrong.Related to #61 handle output json encoding and ostreams interpret output argumentsFor now we don’t properly handle errors when opening files, but thecode is there.Will panic in next commit.Related to #63 required arg parsing + first doit() callWe are parsing required scalar values and handle parse-errors correctly,to the point were we make a simple, non-upload doit() call.It shows that we seem to build invalid calls, for now,but that’s nothingwe can’t fix once the time is ripe.Next goals will be related to finalizing the argument parsing code. infrastructure for call and dry-runNow we are able to cleanly handle our arguments on a per-method basis.The generated code won’t clutter our design as we put the details intotheir own methods. Implementation of JsonTokenStorageIt’s also used by the code, replacing the previous standing,MemoryStorage. init hub + refactor for dry-run modeThe hub is just using preset types - we will have to implement our ownstorage and auth-delegate, as well as a Hub delegate at some point.Dry run mode allows us to check for errors and use a call builderusing the very same code. Display + Error traits for Error struct engine checks resource and method argsWe are now at a spot where we can actually start parsing arguments. write default and read app-secretNext step is to cleanup the error type and implement the Error trait. create config directory, if possible infrastructureFixes #52 generate complete docopts grammarGrammar is laid out per method, providing general purpose argumentsonly as needed/supported.All details will be contained in the markdown documentation.Related to #45 per-method-markdown-filesThat way, all information can be placed within a single markdown fileper method call. This will keep loading times low while maximizingusability.That way, it’s comparable to the API documentation, which is mostdetailed on a per-method basis as well. cli postprocessing supportThat way, a single huge markdown file containing documentation forcommands and methods can be split up into multiple files forindividual inclusion in mkdocs.It’s done by a post-processor which is loaded by mako-render, providingaccess to the entire context. Said processor may also drop resultsaltogether and thus prevent files to be written that have been split upby it. docopt subcommandsSetup command/subcommand pattern.Next will be the infrastucture for documenting these, using mkdocsand markdown. bin renaming + docopt infrastructure basic usage of docoptsFor now we just show it works within our generator.Next step is to actually generate docopts grammar. mkdocs generator works nowIt can be selected for each type of program we want to build, and makessense for everything that is not a library.We also tried to unify names and folders a bit more, even though therecertainly is more work to be done to be fully non-redundant. cli depends on API, genericallyThis allows us to build efficiently. CLI programs can now have theirown cmn.rs implementation, which we can test standalone withcargo test.The primary makefile currently just explicitly pulls in the type-*.yaml,one day we could possibly put it into a loop. 
api generation works once againWith the new structure, it should be easy to add CLI programs withproper dependencies accordingly. Resumable upload implementedWith all bells and whisles. For now, we don’t have a good return valueto indicate that the operation was cancelled, which needs fixing. implement query_transfer_status()The delegate logic is implemented and seems sound.It’s somewhat funny that after all this back and forth, all we getis a valid start position for the upload. ContentRange header (parse and format)Now we are able to send the transfer-update requests and implement theactual chunk logic. use of oauth2::SchemeThat way, we improved our API, reduced code bloat, and are very clearabout the what we do for Authorization. crate version +That way, crate names reveal exact inforamtion about the containedAPI revision. check upload size against max-size make actual store_upload_url() callWe also assure to call only as often as we have to, keeping some statebetween the loops accordingly. improved delegate callsThe delegate will be asked for an upload URL, that he may store duringyet another call. resumable-upload infrastructureLayout the ResumableUploadHelper and implement the entire logicwith the mbuild renerator.All that’s left to be done is to implement the ‘chunked upload’ method.The borrow checker helped me to prevent a bug as well. don’t crash if json decode fails.Instead, tell the delegate about it and return the error. mark unused types with marker traitFor some reason, some google APIs define types they never use. We nowmark them, just because we can, to show our superiority ;) ;) ;) :D . support for ‘variant’ schemaDocumentation links, at one spot, have been updated as well.The variant schema is represented natively as enum, it all looksvery good.Json has been taken care of as well … . Option<_> in schema only if neededThis means that only part fields will be optional. added field aliases, were neededThis makes sure our fields can properly be decoded. use serge instead of serializeHowever, for some reason, the Serialize/Deserialize macros don’t workfor me, even though they work just fine in the respective tests ofthe serge crate. What am I possibly doing wrong ? simplify delegate callsNow we use the DefaultDelegate as standin in case there is user-delgate.That way, we save plenty of complexity as no additionalif let Some(ref mut dlg) = delegate is necesary. prevent duplicate schema typesThese could clash with types we import from Cmn. When that happens,just a single list must be adjusted for a fix, seeunique_type_name begin()/finished() callsDuring begin(), the delegate receives additional information about thecurrent call, which can be useful for performance tracking, amongother things. alt ‘media’ handling to allow dlsThis also includes documentation to state which methods actually supportmedia download, and how to achieve that.Added TODO to not forget we should tell the user how to achieve thesekinds of things. crates with ‘google-’ prefix allow to set user-agent optimizations and simplification; seek optimized memory allocation and optionsThis increases the possible performance, and makes for more readable,concise code. multibytereader single byte testIt shows that we actually don’t handle our state correctly.The first test which reads to string obviously uses a big-enough buffer. 
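The Content-Range handling referenced above (parsing and formatting the `bytes first-last/total` form used while resuming uploads) can be sketched with plain `std`; this is only an illustration of the format, not the crate's actual header type, and it ignores the `*` forms the real header also allows:

```rust
use std::fmt;

// Sketch of the "bytes <first>-<last>/<total>" form sent when resuming uploads.
#[derive(Debug, PartialEq)]
struct ContentRange {
    first: u64,
    last: u64,
    total: u64,
}

impl fmt::Display for ContentRange {
    fn fmt(&self, f: &mut fmt::Formatter) -> fmt::Result {
        write!(f, "bytes {}-{}/{}", self.first, self.last, self.total)
    }
}

impl ContentRange {
    // Parse "bytes 0-999/10000"; returns None on any malformed input.
    fn parse(s: &str) -> Option<ContentRange> {
        let rest = s.strip_prefix("bytes ")?;
        let (range, total) = rest.split_once('/')?;
        let (first, last) = range.split_once('-')?;
        Some(ContentRange {
            first: first.parse().ok()?,
            last: last.parse().ok()?,
            total: total.parse().ok()?,
        })
    }
}

fn main() {
    let header = ContentRange { first: 0, last: 262_143, total: 10_485_760 };
    let text = header.to_string();
    assert_eq!(ContentRange::parse(&text), Some(header));
    println!("{}", text);
}
```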
MultiPartReader is working.Something that is missing is a single-byte read test initial part writingWe are a state-machine, and handle parts of it correctly.However, we don’t yet write the boundary at all, and could improveour use of match. multi-part mime-type and add_parts()Next we will implement the actual Read method handle ‘alt’ paramIt’s conditionally set to json, if we expect a response value. more multipart infrastructureThere is more work to do, as it currently doesn’t compile, nordo we deal with our streams correctly.But we are on a good way. improve body infrastructureThis will support choosing custom readers at runtime, depending onwhether we have a resumable or simple media upload. simplify URL_ENCODE handlingMore maintainable template code, with less redundancy. uri-template handling completeWe now handle url-encoding for the parameters that would require it,and can deal with repeated params that will match ‘/param*’. uri-template generation worksThis doesn’t mean it’s correctly implemented, but we are on our way.It does compile, at least repeated types in examplesMade sure usage examples know how to use repeated types. repeatable parameters workingThe code dealing with them currently assumes they are “/” separated. intermed. support for ‘methods’These ‘methods’ have no resources, and need slightly special handling.This version at least makes the generator work, even thoughit produces duplicates.However, as it is so ugly, I’d rather consider to change itsubstantially … this feature should just come naturally. partial implementation of url exprURL expressions allow to substitute values within the URL withparameters. However, this is not only a simple key-value replacement,but supports expressions that need a parser.This one will have to be implemented next. set upload media typeRelated to #17 add more obvious crate and api version pre-request delegate call.This one is likely to change the further we advance in the upload-mediaimplementation. json decode and delegationNow json errors are handled and delegated with the option to retry,and all other values are just decoded according to plan.For now, I brutally unwrap json values assuming this will work, becauseit really should work. authentication with and without scopesIt’s quite rough around the edges, but has a slight chance to work.Will still to handle return values accordingly. attempt to send json-encoded requestThis doesn’t work yet, as I am unable to unwrap the client properly.It’s a refcell that contains a BorrowMut to a hyper::Client, andlets just, it’s complicated. add cargo.toml dependency information docs and tests of youtube3 on travisThis might already bring it close to 7 minutes runtime, which seemslike providing us with a buffer big enough for when it isfeature-complete. update-json using discovery APIInstead of depending on the google go client API repository, I nowuse the original data source, namely the discovery API. full usage example on landing pageRelated to #4 oauth22 -> oauth2_v2Related to #3 improved library namesRelated to #3 new github-pages targetFor import of all docs to the github now we pre-generate nested schemasInto a complete, global list of schemas, with additional meta-data.However, it’s currently not complete, as $refs are missing.There is some resemblance to to_rust_type(…), which worries meslightly part 1 to implement ‘any’ typeIt is a Json object, with a schema as defined elsewhere. It’s quitecool to see this (nearly) working already. 
However, it will requireus to transitively assign the required markers which is basedon information we don’t currently have.Maybe implementing this could also help to simplify name-clash checksor make them better at least ? build all apis, were possibleNow there is a blacklist feature, allowing to list apis we can’t yethandle for whichever reason. new Scope enum typeFor use in all places where scopes are desired. It will also be madeavailable for adding scopes by the user. scope as property …… however, it will become an enumeration, as I don’t like peopleputting in strings all by themselves. This also means we have togenerate good enums ourselves. query string setupIt works for uploads as well as for others.Next up is to setup the head and authentication. It will be as simpleas calling and handling GetToken, even though I think that thereneeds to be better support for the scope that is asked for … . generic result type… and we actually add additional fields to our fields list. additional fields and Result typeNow query params are handled completely, including clash check.Additionally, there is a new result type which encapsulates everything.It must be typed to the actual result type though, which can be arequest result put all fields onto a listAlso handle the case when the ‘part’ field is generated from therequest. Additional params still need work spike to see how delegate can be workTo avoid an additional type parameter, we will use dynamic dispatchfor the delegate.Having function overrides at some point seems like an excercise betterleft for version 1.1 ;) first attempt to get it to workWith a big but ! The most simple thing to do it was to just addadditional type parameters to the respective method.Now the type cannot be inferred, which means type-hints must be added.This should be easy enough, but … has to be done somehow. media-upload doit() methodsIt’s just a first step, and even though the generation works well,I am still missing the right Rust code. Will have to simplify … param() to set any parameterThat way, things like drive.files.insert alt=media has a chance to work.We should actually check for this to support various ‘alt’ values added gogole drive APIJust to have another, different set of api information to deal with,and not accidentally hard-code things to work with youtube only.Prepared dealing with media uploads, and it turns out to be best toadjust the ‘doit()’ to take the respective type parameter.We also have to think about downloads, like the ones for google drive,which requires custom query parameters. ground work for upload mediaThis might mean we need additional type parameters, but I will see howit’s going to work out.In theory, we could define a new trait for Seek+Read, but this wouldmean that we couldn’t contain owned streams.For max flexibility, it’s better to have additional type parametersand use BorrowMut to allow ownership, and borrow. request type handling part 1Now we will generate proper resoure methods builder calls to instaniatethe more or less valid method builders.However, it doesn’t compile yet, and the ‘to_parts()’ method onresources is still missing. build insert/update … methodsIt’s just the first version which defaults everything.Required parameter lists still have to be built.It’s not going to be a problem at all. properties and setters for mbuilderThis includes descriptions, of course, and generally seems to lookquite neat. For now, we brutally consume all input to own it,but in future we might be able to put in Borrow to support them all. 
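The "consume all input to own it" setter style mentioned in the method-builder entry above corresponds roughly to the following pattern; the type and field names are made up for illustration:

```rust
// Illustrative call builder whose setters take ownership of their arguments
// and return the builder again, so calls can be chained before `doit()`.
#[derive(Default, Debug)]
struct VideoInsertCall {
    title: Option<String>,
    max_results: Option<u32>,
}

impl VideoInsertCall {
    fn title(mut self, new_value: String) -> VideoInsertCall {
        self.title = Some(new_value);
        self
    }

    fn max_results(mut self, new_value: u32) -> VideoInsertCall {
        self.max_results = Some(new_value);
        self
    }

    fn doit(self) -> String {
        // A real call would perform the HTTP request; here we just render it.
        format!("insert title={:?} max_results={:?}", self.title, self.max_results)
    }
}

fn main() {
    let result = VideoInsertCall::default()
        .title("my video".to_string())
        .max_results(5)
        .doit();
    println!("{}", result);
}
```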
infrastructure for method buildersNow comes the actual work of setting them up.Additionally, the docs were decluttered to show comments onlywere necessary. Now the code path to getting the hub is as concise aspossible. Partial MethodBuilder implIncluding documentation at least on the method builder part. Thegreat thing is that fully working examples are now included onevery type !Now more involved part starts … namely setting up the individual callmethod signatures. defs are now more readableThis works with a new indent and unindent filters respectively.There are a few things to consider, but I have understood how it worksand can handle it.There is some overhead just to give me nicer visuals … might choosea different route, like annotations. generate hub implementation and docsThis includes docs for the library usage.It’s totally great to be able to paste example code right were itbelongs, and also put the same elsewhere to compose more complex docs. def for DO NOT EDIT commentsA note like that is now added to all files we generated, commented outdepending on the file type.Quite neat, except that for filtering, I always have to use blocks. Traits now show up as part of libPreviously, they were in an extra, oddly named crate.Now we just make it a part of our generated codebase.That way, traits, and common code, shows up as part of the library.Fair enough.This also means that the types ar not reusable.Maybe a mixed-mode can be used if that is desired. add marker traits to schema typesBased on their involvement in activities.It nearly works perfectly. LUTs and context to make better docsNow a context is passed to utility functions, which contains the statethese may be interested in. This keeps it clean from global state.With the lookup tables, it is possible to figure out relations betweentypes and document them accordingly. first generated result …… just to keep track on how it changes over time. generating valid rust from schemasIt’s very nice, even though there is some more work to be done here.It’s just the beginning … . now sets up entire project structureThat way, we have a common library to pull in from the main repository,and a space for testing new code (in a partial implementation).Next there will be generated object structures. improved license information… and readme, and looks of author listing.Slowly getting into the flow, possibilities seem thrilling. LICENSE + README.mdReadme is very initial, but the architecture is set to evolve it tosomething no less than beatiful. mako-render generates output dirsThat way, the makefile doesn’t need to know that much anymore, andgets simpler/less verbose.# Also apis target - make all apis can now use custom libraries in pycodeNamespaces can exclusively be used during rendering, which is fine ifyou remind yourself of the newline rules.However, I also need some utiltiies that convert input data. Theseare now within their own libraries, which can be used from python blockslike the ordinary python functions they are.Quite neat.In future, most of the functionality will be in separate namespaces,the top-level will just assemble the main library file, usnig theprovided %defs. That way, the main file is kept clean. cargo.toml templateIt’s quite final, and super easy to change and to read.It seems we want to use namespaces/shared implementations soon to allowusing defs. In our case, we transform the version in a particular way,which is easy enough, yet I’d like to use it to make the system morepowerful. 
generic source/output mappingsThis includes proper handling of dependencies.The code is concise, pythonic and quite ‘cody’, but does the job justfine. multiple input-outputs per callThat way, we read the data files only once, but produce all the outputswe need. Together with a powerful makefile, we have a multi-invocationwith proper depedency tracking.Everything will be regenerated though, even though just a single inputtemplate file changed.The alternative would be to have one dependency and invocation perinput dependency, but that will read the entire json each time.Let’s see what’s faster/more useful during development. api deps generation worksIt’s very pleasant to use, and worth the slightly greater effort. mako autosetup and improved executableNow we can write mako templates, with a similar feature set aspyratemp. Except that its syntax is nicer, allows to do everythingand that there is syntax highlight support.Let’s see how it fares successfully generating make depsAfter minor modifications to pyratemp, it certainly does the job.What it does NOT do:It will do the job nonetheless, but mako might be worth a look my first gsl program …And it crashes on linux and on osx.What am I doing wrong ? unified make based build systemAdded all prerequisite programs in binary for easier use.Make is now implemented top-level, and is not expected to do too muchwork actually. It will, however, keep track of all requiredgsl invocation and make sure calls are efficient by not havingto rebuild everything every time. That’s what make does, anyway ;) added authenticator argThat will allow interaction between client and authentication attempts.It also shows how cumbersome it is to deal with all thesegenerics … but hey, you gotta do what you gotta do.If boxes of pointers would be used, it would be easier to handle, butenforces a certain memory model. That, of course, is not desired. makefile for handling json-to-xmlThat way, it will remain clearly documented how to do this, and allowfor efficient calling of gsl as well, at some point.Of course it will be a little more difficult for us to know alldependencies, but gsl could generate these as well for us, I suppose. add conversion tool and youtube apiThe json file needs to be converted to valid XML, which should bedone by a soon-to-be-modified xml2json tool. first primitive types and apiNow it should be possible to implement first version of actualinsert handling, with everything there is about it.That should eventually help to generalize it, as I am definitelynot going to hand-implemented these protocols … .The great thing is, that if done right, one will be able to truly befirst and make an impact ! improved module layoutAs there will be plenty of types, it will be better to split it up.Also learned something about self:: :).Insert and and update should be hand-implemented just to see how it’sworking. Then there should be some investment to auto-generate thiswith gsl. Once the latter works … I could auto-generate all apis,or adjust the go generator to create rust instead.Depends on what will be faster … . figure out ownership modelThere is a central YouTube type which helps constructing varioussub-builders, which in turn provide individual functions.Architecturally, it’s very similar to the go implementation, butmore efficient memory wise. initial commitBase project with everything it will need to grow:Possible issue:I swapped from looping over each key in tc.keys() to assuming the keyswill only ever be [“api”, “cli”]. 
This hard codes the keys instead of getting them dynamically, but makes it
easier to format as a table and lets you pull a lot of the logic out of the
template and into a single block before each table row.
If the types of application in tc.keys() ever changes then this
template will need to be updated accordingly!
 - use new serde map implementation
   No fun, this one.
 - build better data
   Really just what is needed right now to make it work.
   [skip ci]
 - make cli publishing work
   It really needs allow-dirty.
   Let's hope that won't publish too much.
 - try to depend on major version of api
   Previously that didn't work due to a bug in cargo,
   but should work now.
 - cli + api use a single base version
   That way we get rid of the duplication at least.
   Probably it would be enough to just refer to version 1 of the
   library respectively, and let semver do the rest.
 - correct link to license on github
   [skip ci]
 - handle discovery urls with $
   Some google discovery URLs contain $discovery or other variants,
   causing the calls to wget to interpret $d as an environment variable
   instead of a literal. An example is:
   https://logging.googleapis.com/$discovery/rest?version=v2.
   To fix this, the $ has been escaped so that wget fetches the URL as
   expected.
 - Add an unused field to empty API types.
   Null structs (`struct Foo;`) cause the following error when trying to
   deserialize an empty JSON object {} into them:
   JsonDecodeError("{}\n", Syntax(InvalidType(Map), 1, 1)) (also known as
   invalid type: map at line 1 column 1: {}). The optional struct member
   prevents this error.
 - URL-encoding '/' in URLs is not accepted by Google APIs.
 - use redirect flow
   The interactive flow requires pasting a code back into the
   command-line, which only works when it's cat'ed, but not
   if it is pasted.
   This should let it handle everything internally, which is
   way more user-friendly.
 - relative path for custom target dir
   Using a shared target-dir is important to keep
   disk-space usage in check and speed up builds.
 - don't fail by default on non-nightly
 - use working version of serde-codegen
   This update fixes the build on stable, and allows builds
   on nightly as usual.
   The trick is to use the latest version of serde-codegen,
   which keeps the syntex version internal, preventing clashes
   between libraries that might have different requirements.
 - as learned from yup-oauth
   That way, there is no redundancy anymore.
 - work with latest serde
   `cargo test` will work now.
   We now use the latest serde once again, which should
   make everything better.
 - remove cargo/config
   It seems due to a so far possibly unfiled bug, cargo fails to
   get its CWDs right.
   Last verified with cargo 0.11.0-nightly (42bce5c 2016-05-17).
   To reproduce, just put the deleted file back and run a build command,
   such as `make drive3-cli-cargo ARGS=build --no-default-features --features=nightly`
 - use hyper Bearer header style
   Considering we kind-of hardcoded this authentication type anyway,
   we now use the Auth-types provided by hyper 0.8.
   The incentive here was the compiler telling us that the
   yup-oauth::Scheme type doesn't implement the hyper::authorization::Scheme
   anymore, even though that clearly was the case. Also it couldn't be
   reproduced in yup-oauth itself.
   This will need some work to get correct again, so this is just a crude
   patch to make it work again.
 - compatibility with serde 0.6
   0.7 has a weird assertion error that might have happened
   if files get too large.
 - choose serde-version which works
   Everything newer than the ones we see here will cause
   the error described in #148.
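The "unused field to empty API types" entry above can be reproduced with serde/serde_json (an assumption here, since the original code predates today's serde API): deserializing `{}` into a unit struct fails, while a struct with a single optional member accepts it:

```rust
// Sketch using serde/serde_json (assumed dependencies) to show why empty
// response types need at least one optional member.
use serde::Deserialize;

#[derive(Debug, Deserialize)]
struct UnitResponse; // deserializing `{}` into this fails: a map is not a unit

#[derive(Debug, Deserialize)]
struct EmptyResponse {
    // One optional, normally absent member is enough to make `{}` decode.
    unused: Option<String>,
}

fn main() {
    let err = serde_json::from_str::<UnitResponse>("{}").unwrap_err();
    println!("unit struct: {}", err);

    let ok: EmptyResponse = serde_json::from_str("{}").unwrap();
    println!("struct with optional field: {:?}", ok);
}
```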
 - use venv-python to run any utility
   Previously the yaml version generation could fail if your system-python
   didn't have yaml installed. Now the virtual env is used, which is
   guaranteed to support yaml.
 - use latest oauth2 lib
   It enables using std::time::Duration natively.
 - use new discoveryRestUrl field for json download
 - use std::thread::sleep
   However, in sibling libraries, we still use time::Duration, which
   now has its counterpart in std::time::Duration.
   These should be adjusted, to turn the usage of
   sleep(Duration::from_millis(d.num_milliseconds() as u64)) into sleep(d).
   A sketch of the intended simplification follows after these entries.
 - improve handling of error code if stable is tested
 - get cmn compiling on nightly rust
 - assure license can be generated
 - use PYTHONPATH for mako invocation
   That way, it will find its resources.
 - improve version and library name handling
   We can now deal with versions having the 'alpha' or 'beta' suffix.
   It's rather hard-coded, but solves the problem for now.
   Related to #126
 - update to latest serde/rust
 - update to serde 0.5.0
   Serde moved all json code into a separate crate that we are now using
   as well.
 - use clap 1.0.3
 - compatibility with hyper 0.6.4
   Closes #123
 - adjust linux script to target dir
   Previously it attempted to find build-artifacts in
   the 'gen' directory; now these are all found in
   'target', provided cargo 0.3.0 is used.
   [skip ci]
 - flush output stream on CLI output
   For some reason, this is now a requirement - previously this didn't
   seem to be necessary.
   Don't know what changed there ... and it's odd it doesn't flush
   when the process is going down or the handle is destroyed.
 - work with hyper v0.6.0
   Currently the latter actually fails to link on OSX, and requires a local
   override with this fix: https://goo.gl/OTExmN
 - type-inference fails on empty vec
   Previously this wasn't the case, as the type could be inferred from the
   type of the parent-vector to extend.
   Apparently this feature was removed, probably for good reason.
 - make statement shell compatible
   The previous one actually required bash, instead of sh.
 - add type annotation
   It seems to be required when building with an older rustc version.
   This did work in nightly, and just seems to be some sort of limitation
   in stable.
 - work on stable
   CLI was slightly adjusted to not use unstable features.
   Fortunately, there is no serde magic happening, which allows
   us to keep it simple without using a build script.
 - minor fixes
 - expanded header implementation
   Now it compiles to the point where Mime appears as a duplicate type,
   for some reason.
 - first big step towards syntex
   Even though there is a bug that causes {} to be used instead of (),
   when expanding macros, which causes syntax errors that we have to
   work around, it's not a real issue.
   What's happening additionally is missing hyper macros, which
   now have to be expanded manually. Shouldn't be a problem,
   pretty-printing when compiling is made for just that ;).
   No, it's sad that include!() works so badly, it makes
   using serde so difficult ... it's no fun I must say.
   Just for stable ... I am not sure if it is worth it.
 - clean was depending on unknown targets
   There are no per-program-type docs clean targets, so it was just made to
   depend on docs-all-clean.
   Also added the docs-api|cli target to the generated per-program-type
   make help. It was just missing, even though it existed.
 - fix clean target for docs/cli
   clean-all-docs and clean-all-cli aren't valid targets. The current mako
   template causes make clean to abend reporting that it can't make these
   targets.
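The simplification asked for in 'use std::thread::sleep' above, as a minimal sketch; `d_ms` stands in for the millisecond count obtained from the old `time::Duration` value:

```rust
use std::thread::sleep;
use std::time::Duration;

fn main() {
    // The roundabout form used while a time::Duration was still carried
    // around: extract milliseconds, then rebuild a std Duration.
    let d_ms: i64 = 250;
    sleep(Duration::from_millis(d_ms as u64));

    // With std::time::Duration used throughout, the call collapses to:
    let d = Duration::from_millis(250);
    sleep(d);
}
```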
 - URL substitution handling
   Previously we would remove the wrong parameters when attempting to
   remove only those parameters that have been used in the URL
   substitution.
   The code we have now is more idiomatic and appears to be removing the
   correct parameters.
   dc630d01e 2015-05-09
   [skip ci]
 - deal with rustc lifetime issue
   Related to #109
 - limit tar.gz to executable
   Previously it could re-pack tar-files and mess everything up.
   [skip ci]
 - osx-tar files without directory
   Previously, they contained the parent directory, which wasn't intended
   and was different from the plain layout dictated by the linux version
   of the script.
   [skip ci]
 - filter null values of request structs
   Some servers, like youtube, reject null values, possibly thanks to
   the reliance on parts. Now we are filtering them (in a very inefficient,
   but working way), which seems to be fine with the servers.
   Effectively, we seem to be able to upload videos now ... .
   More testing required!
   A sketch of the idea follows after these entries.
 - upgrade to hyper v0.4.0
   It was basically just a find-and-replace to adapt to the changed names
   of Error and Result types.
 - completed list of parameter names
   Previously the 'did-you-mean' functionality only knew the global
   parameters, but not the method-local ones.
 - simplified call to form_urlencode
   It now supports more generic inputs, as suggested in a lengthy
   dialog on a corresponding github issue.
   Required to build with >=0.2.33
 - added latest reference CLI code
   Just to have something to link to.
 - gate usage of upload_media_params
   Previously the local stack variable would be used even though it
   wasn't initialized, as there were no upload flags. Now this only
   happens if there are media params.
   [skip ci]
 - let delegate forget uploaded urls
   When uploading using the resumable protocol, we are now telling the
   delegate to forget the previously stored URL after successful upload.
   Previously it would have tried to return such a URL and thus made
   the system retry uploading a file that was already uploaded.
 - handle repeated required strings
   In a single case we wouldn't properly pass on string arguments that
   were repeated. Now we handle them with a nice one-liner.
 - 'about()' text for main commands
   It shows up in the help, and makes it easier to navigate the command
   tree without bringing up the html documentation.
 - adjust JsonTokenStorage to yup-oauth
   Signature of set() changed to return a Result<(), _> instead of
   an Option<_>.
   Related to https://github.com/Byron/yup-oauth2/issues/5
   [skip ci]
 - unified error handling
 - escape subcommand descriptions
   Otherwise, we could have had invalid rust strings.
   [skip ci]
 - remove unused std_misc feature
   Hopefully this will not trigger errors elsewhere, but we will
   just find out I guess ;)
 - adjust to latest hyper header macros
 - re-introduce UploadProtocol, fix CallType
   All tested APIs seem to build, verified.
 - update docs and fix calltype handling
 - various fixes and improvements
   Related to #81
 - print usage if command is missing
   Also, fixed config-dir substitution in the flag's help message.
 - tweaks to make youtube3 work
   Mainly minor cleanup, and handling of generator branches that
   didn't show up in smaller APIs that were used during the first steps.
   Related to #81
 - adjust option usage to changed API
   Discovery API now builds and seems to work, even! More testing
   will have to be done though to be sure.
 - handle apis without media upload
   We are annotating the type of the optional protocols if that shall be
   required.
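The idea referenced in 'filter null values of request structs' above, sketched with serde_json; the crate's actual helper may differ in detail:

```rust
use serde_json::Value;

// Recursively drop `null` members from JSON objects before the request
// body is sent, so picky servers (like youtube) accept it.
fn remove_json_null_values(value: &mut Value) {
    match value {
        Value::Object(map) => {
            let null_keys: Vec<String> = map
                .iter()
                .filter(|(_, v)| v.is_null())
                .map(|(k, _)| k.clone())
                .collect();
            for key in null_keys {
                map.remove(&key);
            }
            for v in map.values_mut() {
                remove_json_null_values(v);
            }
        }
        Value::Array(values) => {
            for v in values.iter_mut() {
                remove_json_null_values(v);
            }
        }
        _ => {}
    }
}

fn main() {
    let mut body = serde_json::json!({
        "snippet": { "title": "t", "tags": null },
        "status": null
    });
    remove_json_null_values(&mut body);
    assert_eq!(body, serde_json::json!({ "snippet": { "title": "t" } }));
}
```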
 - call iter() directly
   As IntoIter is only implemented for slices up to a size of 32,
   while DFAReporting will reach 55, at least.
   Also added dfareporting-cli code to show how stackoverflow issues can be
   circumvented efficiently.
 - commit before un-using UploadProtocol
   We will try to wait for https://github.com/kbknapp/clap-rs/issues/87
   to allow us to use the enumeration instead of strings, as well as
   an iterator, which will look more idiomatic in the end.
 - generate command data structure
   We do this in the hope of circumventing a stack overflow.
   This means we will set up the parser entirely at runtime, which actually
   saves a little bit of code.
 - upload some code to help debugging
   We get a stack-overflow when trying to run the dfa-reporting program,
   and right now I don't know how to work around it.
   This could be preventing us from using clap.
 - make it work with latest hyper
   This is known to work with the master of hyper. It's probably OK
   to keep it, preparing for the next release and under the assumption
   that I will not be releasing binaries for a while.
 - exclude cloudsearch from build
   It doesn't have a single method, and thus is useless code.
 - updated to v0.1.6, latest CLI
 - CLI + API release preps
 - update changed url crate imports
 - request value cursor handling and docs
 - simple and resumable upload works
   Related to #76
 - use only one request structure
   This works as we can just put all request-structure parsing at the top
   of the function.
   That way, we don't put the request structure in twice.
 - set request value to call
   Previously, even though the request was passed by reference, it was
   copied and thus our changes never arrived in the call.
   Now the API makes this clear by taking ownership, and the CLI code
   sets the Request value later on, explicitly.
   Related to #76
   A reduced sketch follows after these entries.
 - verified download works
 - update all code to latest version
 - response value json decoding
 - implement deletion of tokens
   Previously this case was entirely uncovered.
   Interesting note: when a token is revoked, existing auth-tokens will
   still work. However, you may not refresh them in case permissions
   have been revoked. It's good as there is only one code-path to deal
   with (and we verified it to be working), and bad for the user as
   malicious software can keep using an account for a certain time until
   the token expires.
 - adapt to changed yup-oauth2 API
   The latter changed a lot, for the better, and we handle the new
   return types accordingly.
   Related to #74
 - resolve generator issues
 - update make target
   Also, generate CLI. Probably there is not enough time to build it.
 - README info + fix author email
   Please note that the docker build script is still in debug mode; this
   issue will remind me about it: #72
 - scopes were used illegally
   Which caused a compile error. This was fixed by assuring the code
   uses the same function to determine whether or not scopes are present
   per method.
   [skip ci]
 - (abf0548b5 2015-04-15) (built 2015-04-15)
 - latest version of all APIs
   Now CLI and API are at the same level.
 - request value parsing compiles and inits
   Therefore, you can now actually make the call and the program will not
   crash due to uninitialized Options.
 - struct access compiles ...
   ... but currently wouldn't run as we don't initialize the optional
   sub-structures at all.
 - corrected cursor handling in mkdocs
   The trick was to use an actual list of cursor tokens that is consumed
   on use. That way, we don't lose track of where we are in the structure.
   Related to #64
 - NULL default values instead of randoms
   Instead of generating nonsense random values, we just map defaults
   that represent the respective NULL value of a given type.
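The ownership change referenced in 'set request value to call' above, as a reduced sketch with made-up types; the generated call builders are more involved:

```rust
// Hypothetical, reduced shapes standing in for per-API request types.
#[derive(Clone, Default, Debug)]
struct VideoRequest {
    title: Option<String>,
}

struct InsertCall {
    request: VideoRequest,
}

// Taking the request by value makes it obvious that later mutations of
// the caller's copy cannot affect the call; the CLI sets the value
// explicitly before handing it over.
fn insert(request: VideoRequest) -> InsertCall {
    InsertCall { request }
}

fn main() {
    let mut req = VideoRequest::default();
    req.title = Some("my upload".to_string());
    let call = insert(req); // ownership moves here; `req` can no longer be edited
    println!("{:?}", call.request);
}
```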
 - alt-media handling in CLI+API-docs
 - optional parameter default handling
   Now we provide a matching default for each parameter, thus alleviating
   the previous issue of unnecessary follow-up errors.
 - add rustc_serialize to test-crate
   A top-level cargo test didn't work anymore thanks to a missing
   mention of rustc_serialize.
   [skip ci]
 - optimize argument handling and conversion
 - Display for Errors + refactor
 - improved scope handling; fix CLI
   Related to #48
 - add commands.yml.mako
   It was previously hidden thanks to .gitignore.
   Good I made a fresh clone to see if make really, really works.
 - dependencies are now per-program-type
   Previously we put cli.py into the common lib folder, which caused the
   API to be regenerated and rebuilt whenever we changed code that would
   only affect the CLI, causing terrible turnaround times.
   Now the dependency is fixed.
 - 'bytes ...' -> 'bytes=...'
 - better subtext + rename target
 - one folder per API docs
   Otherwise, it would overwrite its search index, effectively breaking
   the search field.
   We might run into space issues on github, as the generated docs are
   duplicating each other and use a lot of disk-space.
 - use bytes=... when sending as well
   Previously, bytes= was just parsed, but not sent to the server.
   This change is motivated by a similar change in this commit:
   http://goo.gl/AvyvLb
 - fix dependencies
   That way, we don't build documentation unless this is truly necessary.
 - add publish state
 - v0.1.5
 - corrected absolute links
   This only affected links in readme files, not the relative ones
   in the actual documentation.
 - v0.1.5
 - adjust to hyper client
 - v0.1.4
 - adjust invalid make target
 - v0.1.4
   The latter pointed me to the serde issue, which would have made
   everything fail when actually used to communicate with google servers.
 - v0.1.3
   Related to #46
 - version 0.1.3 rustc (be9bd7c93 2015-04-05)
   Related to #46
 - github-pages index generation
   Previously, we forgot to pull in the new type-specific dataset, which
   caused the index.html.mako file to fail.
 - check-in of latest sources
   This also includes crate files to remember which
   crates we have published already.
   Related to #44
 - set the API version to 0.1.2 incl. Result
 - conform to standards
   Related to #44
 - remove newlines interpreted as test
   When documenting mandatory arguments of a method builder, it was
   possible to document 'parts', which have a long multi-line description
   with empty lines in between. This caused the indentation to be funny
   and tricked rustdoc into thinking these are indented doc-tests.
   Now we remove these empty lines, also hoping we don't encounter lines
   with just whitespace in them. The latter would require a tiny update
   of our regex.
 - remove custom Result enum
   Instead, we just use a customized Result type and thus stick to
   common Rust conventions.
 - update json files from discovery API
 - typo fixes and misc. improvements
 - whitespace and trait rename
 - upload size now taken properly
   Previously, it would query the size from the wrong dict and obtain
   the value 0 all the time. This would have made every upload fail with
   UploadSizeLimitExceeded.
   Now we obtain the actual size limit, and will ignore it if unset/0
   for some reason. A sketch of the check follows after these entries.
 - Patch += 1
 - 0.1.0 release
 - upload() return value handling
   Now deals with Cancellation and non-OK status codes correctly.
 - re-export types used by delegate
   Otherwise, delegate implementation might not actually be possible.
 - better introduction and version handling
   Make it crystal clear what the crate version means, and what version of
   the documentation you are looking at. Also do this in the README file.
   Assure that 'Google' is capitalized.
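The check referenced in 'upload size now taken properly' above, as a sketch; the error type here is illustrative and not the crate's actual error enum:

```rust
// Illustrative error type; the real crate reports this differently.
#[derive(Debug)]
enum UploadError {
    UploadSizeLimitExceeded { size: u64, max_size: u64 },
}

fn check_upload_size(size: u64, max_size: u64) -> Result<(), UploadError> {
    // A limit of 0 means "unset"; only enforce a real, non-zero limit.
    if max_size > 0 && size > max_size {
        return Err(UploadError::UploadSizeLimitExceeded { size, max_size });
    }
    Ok(())
}

fn main() {
    assert!(check_upload_size(10, 0).is_ok()); // unset limit is ignored
    assert!(check_upload_size(10, 100).is_ok());
    assert!(check_upload_size(200, 100).is_err());
}
```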
 - repository/source-code link
   Previously it pointed to a timestamp file. Unified repository
   source code link generation, and simplified 'deps.mako'.
   Related to #38
 - simplification and cleanup
 - schema_markers() accessed map incorrectly
 - prune unused and ToParts trait
   Linters are back to 'normal'.
 - pretty names for methods and resources
   Previously, it was possible for methods to have '.' which showed up
   in the documentation. Now these are replaced with ' '.
 - exclude those with recursive schemas
   They currently don't compile, as Box is not supported by 'serde'.
   See https://github.com/erickt/rust-serde/issues/45.
   Related to #34.
 - make recursive types possible
   Must be Option<Box<...>> now, whereas a simple Box worked
   previously. Anyway, serde can't decode/encode Boxes yet, so
   plus1 was removed from the list of APIs to build.
   Related to #34
 - MethodBuilder -> CallBuilder
   Find-and-replace. It seems to build and work correctly, still.
 - improved markdown for library overview
   And names of free methods, which previously contained '.'. These are
   now spaces.
 - just add latest youtube code
   It's good to see what actually changed in the json realm.
 - Vec/HashMap are Optionals
   That assures that we can decode partial server responses, and send
   partial structures as well. A small illustration follows after these
   entries.
 - serde cleanup; JsonError pub fields
 - prevent type-clash with Result
   This should have been fixed in the previous commit, but wasn't.
   Actually a change that fixed one API broke the other, and vice-versa.
   It's kind of a hack, because it's tailored to fix particular cases only,
   yet I believe it's contained enough to say 'good enough'.
 - some links pointed to old doc name
   With one of the recent changes, the crate name was changed to be
   different from the library name. However, there were still plenty of
   places that would refer to the library name instead of the new crate
   name.
   That way, links in the README.md as well as index/index.html still
   pointed to the old location.
 - MultiPartReader test case
   Simple fixes, required as its API changed.
 - MultiPartReader now works correctly
   The state-handling was incorrect, causing it to not handle small reads
   correctly.
   However, this is working nicely now.
 - fix lifetime issues
   Those were totally real, actually, and I am happy the borrow checker
   exists!
   Only one weirdness happened with RefCell<...>, but it could be
   fixed by checking actual types using let foo: () = something_nasty().
 - repeated params string addition
   It seems odd that String + &str is required.
   In any way, previously it would try to add i32 to a string.
 - repeated parameters docs improvement
   Previously, it said it would 'set' the property, which is not the case
   after all.
 - regenerate .api.deps less often
   It took too long to do it, so the 'MAKO_LIB_FILES' dependency was
   removed. It can be re-added if needed.
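The pattern referenced in 'Vec/HashMap are Optionals' above, sketched with present-day serde and made-up field names:

```rust
use std::collections::HashMap;
use serde::Deserialize;

// With collection members wrapped in Option, a partial server response
// still decodes: missing members simply become None.
#[derive(Deserialize, Debug)]
struct Playlist {
    id: Option<String>,
    tags: Option<Vec<String>>,
    localizations: Option<HashMap<String, String>>,
}

fn main() {
    let partial = r#"{ "id": "abc" }"#;
    let p: Playlist = serde_json::from_str(partial).expect("partial response decodes");
    assert!(p.tags.is_none() && p.localizations.is_none());
}
```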
 - decent solution for free methods
   Now I just add a 'virtual' resource, which is called 'methods'.
   The good thing about this is that it works well standalone, or
   in conjunction with actual resources.
   Also the system works with it with just minimal changes.
 - unit-tests work once again
   Added missing Result cases to make the match exhaustive.
 - remove BorrowMut until it's cleared
   See stackoverflow at http://goo.gl/f27zJkj.
   Now we can actually call our client and move on with handling the result.
 - use lower-case library names
 - always force python2.7 in virtualenv
   Force the usage of python2.7 on systems where /usr/bin/python points
   to python3.x.
   Fixes issue #12
 - incorrectly capitalized cargo.toml
   This caused cargo on a case-sensitive file-system not to find the
   cargo file, which made it look upwards in the directory structure
   to find the correctly named Cargo.toml for the 'cmn' development
   project.
 - explicit subshell for cargo-doc
   Previously, it was only executing for cargo $(ARGS).
 - try using a subshell for cargo cmd
   Apparently travis doesn't execute cargo in the right sub-directory.
   Might be a difference in the way make works.
   Related to #8
 - fixed dependency to wrong target
   Which caused the cmn.rs to be missing, and the build to fail.
 - install virtualenv automatically
   The only dependency we really have is python, and wget.
   Pip is not needed!
 - fully qualified activity names
 - Do not generate docs!
   Previously, travis would continuously overwrite my combined docs with
   the ones from the dev-project, and make them useless.
   This has been driving me nuts! Good to have it fixed!
 - added milestone link
   It's important to the project, and should thus be listed there.
 - use function to make links correctly
   It will automatically handle rust-doc mode, and use relative or absolute
   links respectively.
 - assured it handles '0' correctly
   Previously, it could possibly take a '0' away from the start of a
   version. Now this is definitely not possible anymore.
 - make 'regen-apis' work
   Thanks to changes in mako libraries, it won't work anymore without
   the template directory set.
 - typo fix
 - incorrect nested type names
   There was a name-duplication which led to un-implemented types.
   The good thing is that this was the last issue that kept all 72
   APIs from compiling.
 - finally, we pick up all types
   HashMap types were missing previously, but now they seem to be picked
   up quite nicely.
   Would this mean we do the type-setup correctly, everywhere?
 - transitive, minimal traits for types
   Previously, I would just assign all useful traits to all types, no
   matter how they were actually used.
   Now it builds all dependencies and considers them when assigning
   traits, which is as precise as we need it.
   This is important to us as the Json type is just encodable, but
   not decodable. Fortunately, we just have to encode it, but in theory
   this makes it hard to embed any json in a known structure.
 - no unused types anymore
   Due to shared global state which was altered, we got wrong results.
   This is fixed now, thanks to a deepcopy. Amazing how altering global
   state is always biting you, even though you are convinced it's safe
   to do in just this case!
   General rule: just don't do it, no matter what!
 - improved camelCasing
   Previously, it was possible to get types like Foo_bar, which is not
   desirable.
   Now it is totally impossible to see such blasphemy ;)
   A sketch of the naming rule follows after these entries.
 - protect from nested-type-clash
   It was possible for a nested type to be generated with a name that in
   fact CLASHED with an existing schema type. What are the odds!
   The clash-check added will just verify against clashes with schema
   types, which seems to be doing it for now.
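The naming rule referenced in 'improved camelCasing' above, sketched in Rust for illustration; the actual generator implements this in its Python/mako tooling:

```rust
// Split on '_', '-' and '.' and capitalize each part, so "foo_bar"
// becomes "FooBar" rather than "Foo_bar".
fn to_type_name(raw: &str) -> String {
    raw.split(|c| c == '_' || c == '-' || c == '.')
        .filter(|part| !part.is_empty())
        .map(|part| {
            let mut chars = part.chars();
            match chars.next() {
                Some(first) => first.to_uppercase().collect::<String>() + chars.as_str(),
                None => String::new(),
            }
        })
        .collect()
}

fn main() {
    assert_eq!(to_type_name("foo_bar"), "FooBar");
    assert_eq!(to_type_name("creative-field.value"), "CreativeFieldValue");
}
```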
 - nested type names are consistent now
   At least so it appears.
   The implementation doesn't look totally clean to me, as it seems
   similar concerns are in different portions of the code, which was
   merely tuned to work together.
   It could break apart if someone - me - wants to change it sometime.
 - scope -> add_scope
   This is not only more precise about what it does, but also doesn't
   clash with scope parameters on resources ;) (happened in dfareporting)
 - improved nested array type handling
   It needs deduplication though, coming up next.
 - prevent struct recursion issue
   This works by just boxing types which are nested within themselves,
   breaking the recursion. A sketch follows after these entries.
 - nicer code and identifiers
 - nested types work for arrays
   Thanks to removed code which made no sense to me, I put in a bug.
   Now the code is back, better than ever, and documented as well :).
 - now deals with non-objects
   These are arrays or HashMaps, which are nested types too. This is used
   to have custom types of standard vectors or hashmaps, which resolve
   to NewTypes in Rust.
 - optionals are working once again
   A bug was introduced which caused nested-types not to be optional
   in situations where they should be.
 - nested type resolution and hashes
   It seems we do it better than the actual Go implementation, which fails
   to detect that scopes actually have a string member.
   However, there still is an issue, as it's a hashmap for us, but just
   a member for go ... let's see ...
   https://developers.google.com/discovery/v1/reference/apis#resource
   shows that we implement it correctly :) !!
 - remove compiler warnings.
   Also, a build issue was fixed when schemas were no objects.
   However, I think I will have to check this one anyway.
 - no compiler warnings
   This involves disabling the dead-code lint, which is just to ease
   debugging; currently there is a lot of dead code as 'hub' is never used.
   Soon this will change, so the lint will be enabled again.
 - deepcopy dicts instead
   It was possible for writes to happen in nested dicts, causing global
   data to change and confuse the system.
   Not that I wouldn't be aware of that danger, but apparently I didn't
   see the recursiveness of the call tree :).
 - fixes to help more projects to build
   Involving:
 - fix name clashes
   Scopes could be invalid, previously, and the hub type could clash
   with other types provided as Schema.
   Also, we used reserved identifiers.
 - deal with missing auth information
   Now all APIs can be built successfully, which should help to
   prevent things from getting hardcoded in any way.
 - resource-to-category map
   It allows obtaining the category, which we previously dropped.
 - do not degenerate during activity_split
   First step; the next one will actually be keeping that data ...
 - assure candidate is in mapping
   It seems nearly nothing can be taken for granted ;).
   It's best to just run against a big set of APIs and fix issues as they
   arise though.
   More flexibility means more maintenance, after all.
 - intermediate improvements ...
   ... it shows that the override I used previously won't work for admin.
   Therefore we have to keep the actual value, instead of degenerating it.
   Makes sense ... it's interesting how much one tends to hard-code things
   to work just for a few cases, unless you opt in to see the whole picture.
 - ignore beta/alpha, assure latest
   There were a few bugs in the generator program, which caused old
   versions to be picked up, and alphas/betas.
 - now with flattened activities
   That way, we don't have recursive method builders, but instead
   flatten the hierarchy. In our case, this is easier and means
   less to type. E.g. hub.user().message().import(...)
   now is hub.user().message_import(...).
   In go it would look like this though: `hub.user.messages.import(...)`,
   which is neater.
   We will never have that though, as it would initialize a massive amount
   of data right on the stack, even though only some of it is ever used
   ... .
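The boxing rule referenced in 'prevent struct recursion issue' above, as a sketch; `Item` is a made-up stand-in for a self-referencing schema:

```rust
// A schema nested within itself must be boxed (and kept optional),
// otherwise the type would have infinite size.
#[derive(Debug, Default)]
struct Item {
    name: Option<String>,
    // `child: Option<Item>` would not compile: "recursive type has infinite size".
    child: Option<Box<Item>>,
}

fn main() {
    let nested = Item {
        name: Some("parent".into()),
        child: Some(Box::new(Item { name: Some("leaf".into()), child: None })),
    };
    println!("{:?}", nested);
}
```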
 - first recursive resource support
   However, this also means we need recursive builders, which is totally
   unsupported for now ... .
   This means we have to generalize rbuild generation ... could be easy.
   Let's see.
 - make scope gen work with gmail
 - scopes are sorted Strings now
   That way, we make retrieved tokens independent of the order scopes
   were passed in. Additionally, we can pass any scopes, just in case
   someone uses one token for multiple APIs.
   Let's keep it flexible. A sketch follows after these entries.
 - Manual scope parameter ...
   ... however, it should better be a set, and there must be a way to
   control certain global names using the configuration :)
 - it now works in every which way
   Custom property annotations have been very useful, to steer very special
   cases.
   It's also good that now there is enough infrastructure to deal with
   any amount of additional type parameters.
 - added size and mime type support
   This information must be provided, I just forgot it.
 - doit() call with enum type annotation
   It's syntax I never used before, but ... it works!
   Now let's try to get the BorrowMut back.
 - recursion for nested types
   Drive has recursive nested types, which were not handled previously.
 - examples section in mbuilder got lost
 - filter request value props by parts
   Previously, it would just show all parts.
   It's still not correct though, as this isn't necessarily the parts used
   in the request value, but only the ones in the response value.
   It's as good as it gets though; that's all the information contained
   in the json.
 - method builder examples work now
   It was easier than expected, but in the end needs quite some custom
   code and handling. Good to have mako (and python!!!)
 - have to handle required/optional vals
   Of course, it's ok to do that, but ... of course it doesn't make things
   easier. However, I want these examples to represent the real thing.
 - remove empty '/// # ' lines
   They seem to make cargo fail to build valid doctests. Might be worth
   a ticket!
 - fixed part handling, it compiles now
   What's missing is docs, which will see some work now.
   I guess it will be best to hide all the prelude from the user, to allow
   him to focus on what's important here.
 - setters now copy copyables
   Previously, they would take everything as reference and clone
   everything unconditionally. Now we do it only as we need to do it,
   no extra work incurred.
 - using visual markers now
   Makes everything evaluate faster, and is good enough as well.
   Besides, you don't have to think about whitespace too much; keeping
   things simpler is usually better.
 - perfected trait recognition.
   However, they are not listed as traits of the youtube api. What we
   really want is to list common implementation types as part of ourselves.
   This doesn't work though as long as we don't have the common impl
   as part of our sources.
 - now the map is complete
   It's quite nice - next up is marker traits!
 - dependency handling: dirs with timestamp
   That way, make will not regenerate unnecessarily.
 - make all pods optionals.
   That way, json conversions will always work, which is probably what
   we desire (especially when handling server answers).
 - now docs look good too
 - unify generated constants
   like library-name. That way, they are always the same, even if I
   change my mind.
   Good coding style is easy, using the current setup.
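The normalization referenced in 'scopes are sorted Strings now' above, as a sketch; the real token storage keys off these scopes in its own way:

```rust
// Sort and de-duplicate the scope strings so the token cache key no
// longer depends on the order (or multiplicity) in which scopes were
// passed in.
fn normalize_scopes(mut scopes: Vec<String>) -> Vec<String> {
    scopes.sort();
    scopes.dedup();
    scopes
}

fn main() {
    let a = normalize_scopes(vec![
        "https://www.googleapis.com/auth/youtube".into(),
        "https://www.googleapis.com/auth/drive".into(),
    ]);
    let b = normalize_scopes(vec![
        "https://www.googleapis.com/auth/drive".into(),
        "https://www.googleapis.com/auth/youtube".into(),
    ]);
    assert_eq!(a, b); // same token key regardless of original order
}
```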
 - mv youtube-rs to google-apis-rs
 - handle whitespace and add GENINFO
   Also, remove obsolete pyra files.
 - is now self-contained
   A little more than the promised 500 lines of code though ;).
 - removed gsl, added pyratemp
   As GSL failed in my first attempt to get the example program going,
   it might be better to try something else before too much time is spent.
   Fortunately, pyratemp seems to be something usable, and even if not,
   it might be possible to make it usable as it's just a 'simple'
   python script that I might be able to understand, if need be.
 - fixed dependencies
   The make deps generator should only care about the shared xml.
 - forgot to add shared.xml
   As XML files are ignored, I didn't see that.
 - works exactly as needed.
   Producing non-malformed pretty xml.
 - xml.tostring works now ...
   ... but it still generates invalid output due to scopes.
   Should be an easy fix.
 - make it handle top-level keys
   It can now handle multiple of them.
   However, conversion fails, as the bloody xml converter can't handle
   booleans??? WTF
 - make sure we get correct openssl vers.