As IntoIter is only implemented for arrays up to a size of 32.
DFAReporting, though, will reach 55, at least.
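A minimal sketch of what that limit looks like in practice (array sizes
chosen for illustration; on the rustc of that time, the std impls for
fixed-size arrays stopped at length 32):

```rust
fn main() {
    let small = [0u8; 32];
    for b in &small {          // fine: `&[u8; 32]` implements IntoIterator
        let _ = b;
    }

    let large = [0u8; 55];     // a DFAReporting-sized array
    // for b in &large {}      // on old rustc: `&[u8; 55]` has no IntoIterator impl
    for b in large.iter() {    // workaround: go through the slice iterator instead
        let _ = b;
    }
}
```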
Also added dfareporting-cli code to show how stack-overflow issues can be
circumvented efficiently.
We get a stack-overflow when trying to run the dfa-reporting program,
and right now I don't know how to work around it.
This could prevent us from using clap.
We are now able to decode detailed errors and pass them on. This allows
the CLI to provide more useful error responses.
Additionally, the CLI will only print debug responses in --debug mode.
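A rough sketch of the decoding step, using serde-style derives; the type
names are illustrative, not the generated ones, and the error body shape
is simplified:

```rust
use serde::Deserialize;

// Simplified shape of a detailed JSON error body.
#[derive(Debug, Deserialize)]
struct ErrorResponse {
    error: ServerError,
}

#[derive(Debug, Deserialize)]
struct ServerError {
    code: u16,
    message: String,
}

fn main() {
    let body = r#"{"error": {"code": 403, "message": "Daily Limit Exceeded"}}"#;
    // If the body parses as a detailed error, pass it on so the CLI can
    // print something more useful than a raw HTTP status.
    match serde_json::from_str::<ErrorResponse>(body) {
        Ok(err) => eprintln!("API error {}: {}", err.error.code, err.error.message),
        Err(_) => eprintln!("unrecognized error body: {}", body),
    }
}
```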
Fixes#82
Which caused a compile error. This was fixed by ensuring the code
uses the same function to determine whether or not scopes are present
per method.
[skip ci]
That way, changes can be tracked.
Also, we make it official.
Future check-ins will only be made when major changes have occurred,
similar to how the APIs are handled.
Related to #64
* macro 'alias' was renamed to 'rename'
* fixed `cargo test` on main project
The latter pointed me to the serde issue, which would have made
everything fail when actually used to communicate with Google servers.
* keywords are no longer than 20 characters, which is a restriction
cargo imposes
* don't use 'homepage' link in Cargo.toml unless the homepage is
non-empty
* Added all publish-results to mark the respective crate version
Related to #46
It can be selected for each type of program we want to build, and makes
sense for everything that is not a library.
We also tried to unify names and folders a bit more, even though there
certainly is more work to be done to be fully non-redundant.
Fixes#43
This allows us to build efficiently. CLI programs can now have their
own cmn.rs implementation, which we can test standalone with
`cargo test`.
The primary makefile currently just explicitly pulls in the type-*.yaml;
one day we could possibly put it into a loop.
Fixes#11
Previously, it would query the size from the wrong dict and obtain
the value 0 all the time. This would have made every upload fail with
`UploadSizeLimitExceeded`.
Now we obtain the actual size limit, and will ignore it if it is
unset or 0 for some reason.
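A sketch of the intended check, with hypothetical names; a limit of 0 or
an absent limit is treated as "no limit":

```rust
/// Returns true if `upload_size` is acceptable for the method's limit.
/// `max_size` comes from the API description; 0 or None means "no limit".
fn within_limit(upload_size: u64, max_size: Option<u64>) -> bool {
    match max_size {
        Some(0) | None => true,          // unset or 0: ignore the limit
        Some(limit) => upload_size <= limit,
    }
}

fn main() {
    assert!(within_limit(1_000, None));
    assert!(within_limit(1_000, Some(0)));
    assert!(!within_limit(2_000_000, Some(1_000_000))); // would fail the upload
}
```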
Patch += 1
The delegate logic is implemented and seems sound.
It's somewhat funny that after all this back and forth, all we get
is a valid start position for the upload.
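A hedged sketch of that last step, with parsing simplified to the bare
minimum: the `Range` header of a resumable-upload status response names
the last byte the server already has, and the next chunk starts right
after it:

```rust
/// Derive the next start position from a `Range: bytes=0-<last>` header
/// returned by a resumable-upload status request. No header means nothing
/// was received yet, so we start at 0. (Parsing simplified for illustration.)
fn next_start(range_header: Option<&str>) -> u64 {
    range_header
        .and_then(|r| r.rsplit('-').next())
        .and_then(|last| last.parse::<u64>().ok())
        .map(|last| last + 1)
        .unwrap_or(0)
}

fn main() {
    assert_eq!(next_start(None), 0);
    assert_eq!(next_start(Some("bytes=0-262143")), 262144);
}
```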
Make it crystal clear what the crate version means, and what version of
the documentation you are looking at. Also do this in the README file.
Ensure that 'Google' is capitalized.
That way, crate names reveal exact information about the contained
API revision.
* crate version: code gen version
* +<revision> (build-metadata): exact version of API schema
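As a purely hypothetical example, a crate published as version
`0.1.4+20150326` would mean: generated with code generator version
0.1.4 against API schema revision 20150326.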
Fixes#38
* renamed `*MethodsBuilder` type to `*Methods` type
* renamed `*CallBuilder` type to `*Call` type
* greatly simplified `doit()` signature if uploads are involved
* pass `auth` to upload helper
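To illustrate the effect of the renames only, here is a self-contained
sketch with made-up stub types; the real generated types carry the
actual API's names, fields, and HTTP logic:

```rust
// Stub types purely to show the naming scheme: `*Methods` instead of
// `*MethodsBuilder`, `*Call` instead of `*CallBuilder`.
struct Hub;
struct ReportMethods;
struct ReportListCall {
    profile_id: String,
}

impl Hub {
    fn reports(&self) -> ReportMethods {
        ReportMethods
    }
}

impl ReportMethods {
    fn list(&self, profile_id: &str) -> ReportListCall {
        ReportListCall { profile_id: profile_id.to_owned() }
    }
}

impl ReportListCall {
    // In the generated code, `doit()` performs the actual request.
    fn doit(self) -> Result<String, String> {
        Ok(format!("would list reports for profile {}", self.profile_id))
    }
}

fn main() {
    let hub = Hub;
    println!("{:?}", hub.reports().list("12345").doit());
}
```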
Lay out the `ResumableUploadHelper` and implement the entire logic
with the mbuild generator.
All that's left to be done is to implement the 'chunked upload' method.
The borrow checker helped me to prevent a bug as well.
* do not emit unused types. Sometimes though, rustc doesn't seem to
detect that attributes are actually used
* ToParts trait is used and implemented only when needed.
Linters are back to 'normal'.
Fixes#35
Now we use the DefaultDelegate as a stand-in in case there is no
user delegate. That way, we save plenty of complexity, as no additional
`if let Some(ref mut dlg) = delegate` is necessary.
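A minimal sketch of the idea; the real Delegate trait has more methods,
and the method and bodies here are made up for illustration:

```rust
trait Delegate {
    // Called e.g. on HTTP failures; the default implementation does nothing.
    fn http_error(&mut self, _msg: &str) {}
}

struct DefaultDelegate;
impl Delegate for DefaultDelegate {}

fn doit(user_delegate: Option<&mut dyn Delegate>) {
    // Substitute the no-op delegate up front, so the rest of the code can
    // just call `dlg.http_error(...)` without repeated `if let` checks.
    let mut default = DefaultDelegate;
    let dlg: &mut dyn Delegate = user_delegate.unwrap_or(&mut default);
    dlg.http_error("connection reset");
}

fn main() {
    doit(None);

    struct Verbose;
    impl Delegate for Verbose {
        fn http_error(&mut self, msg: &str) {
            eprintln!("http error: {}", msg);
        }
    }
    let mut v = Verbose;
    doit(Some(&mut v));
}
```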
Fixes#30
With one of the recent changes, the crate name was changed to be
different from the library name. However, there were still plenty of
places that would refer to the library name instead of the new crate
name.
That way, links in the README.md as well as index/index.html still
pointed to the old location.
* add method listing for various categories, like 'downloads' and
'uploads'
* add general information on how to do downloads and uploads using
various protocols
Fixes#28
This also includes documentation to state which methods actually support
media download, and how to achieve that.
Added a TODO so we don't forget to tell the user how to achieve these
kinds of things.
Fixes#21
* reserve_exact(X) where possible (params, multi-part-reader)
* `if let` used wherever possible to prevent duplicate checks
This improves performance and makes for more readable,
concise code.
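Two illustrative snippets of what those changes amount to; the map and
values are hypothetical:

```rust
use std::collections::HashMap;

fn main() {
    // reserve_exact: we know exactly how many parameters will be pushed,
    // so ask for precisely that much capacity instead of letting the Vec
    // over-allocate as it grows.
    let mut params: Vec<(&str, String)> = Vec::new();
    params.reserve_exact(3);
    params.push(("alt", "json".to_string()));
    params.push(("prettyPrint", "false".to_string()));
    params.push(("quotaUser", "example".to_string()));

    // `if let` instead of a contains_key() check followed by a lookup:
    // one hash lookup and no duplicated condition.
    let mut map: HashMap<&str, u32> = HashMap::new();
    map.insert("retries", 3);
    if let Some(retries) = map.get("retries") {
        println!("retries: {}", retries);
    }
}
```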
Those were totally real, actually, and I am happy the borrow checker
exists!
Only one weirdness happened with RefCell<BorrowMut<C>>, but it could be
fixed by checking actual types using `let foo: () = something_nasty()`.
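For reference, the type-inspection trick mentioned above: annotate a
binding with a deliberately wrong type and rustc's error message will
spell out the real, inferred type. The sketch below compiles as written;
uncommenting the marked line triggers the informative error:

```rust
use std::cell::RefCell;

fn something_nasty() -> RefCell<Vec<u8>> {
    RefCell::new(Vec::new())
}

fn main() {
    // Uncommenting the next line yields an error along the lines of
    // "mismatched types: expected `()`, found `RefCell<Vec<u8>>`",
    // which is exactly the information we were after.
    // let foo: () = something_nasty();
    let _ = something_nasty();
}
```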
* outer frame of `MultiPartReader` to allow using it in `doit()`
* restructured `doit()` to get content-types right
There is more work to do, as it currently doesn't compile, nor
do we deal with our streams correctly.
But we are well on our way.
Now I just add a 'virtual' resource, which is called 'methods'.
The good thing about this is that it works well standalone, or
in conjunction with actual resources.
Also, the rest of the system works with it with only minimal changes.
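For instance (hypothetical method names), a top-level API call ends up
as `hub.methods().get_account(...)`, right next to regular resources
such as `hub.reports().list(...)`.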
Fixes#19
Now JSON errors are handled and delegated with the option to retry,
and all other values are just decoded according to plan.
For now, I brutally unwrap JSON values assuming this will work, because
it really should work.