We are parsing required scalar values and handling parse errors
correctly, to the point where we make a simple, non-upload doit() call.
It shows that we seem to build invalid calls for now, but that's nothing
we can't fix once the time is ripe.
Next goals will be related to finalizing the argument parsing code.
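For illustration, a minimal sketch of what parsing a required scalar and
forwarding it to a doit() call could look like; `parse_scalar`,
`CallError` and the `max-results` argument are made-up names, not the
generated code's actual API:

```rust
use std::str::FromStr;

// Hypothetical error type for the CLI layer; not the generator's actual one.
#[derive(Debug)]
enum CallError {
    ParseFailure { arg: &'static str, value: String, msg: String },
}

// Parse a required scalar argument into its target type, turning a failure
// into a descriptive error instead of panicking.
fn parse_scalar<T>(arg: &'static str, value: &str) -> Result<T, CallError>
where
    T: FromStr,
    T::Err: std::fmt::Display,
{
    value.parse::<T>().map_err(|e| CallError::ParseFailure {
        arg,
        value: value.to_owned(),
        msg: e.to_string(),
    })
}

fn main() {
    // A simple, non-upload call: parse the scalar, then hand it to doit().
    match parse_scalar::<u32>("max-results", "25") {
        Ok(max_results) => println!("would call doit() with max-results = {}", max_results),
        Err(err) => eprintln!("argument error: {:?}", err),
    }
}
```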
Fixes #60
That way, a single huge markdown file containing documentation for
commands and methods can be split up into multiple files for
individual inclusion in mkdocs.
It's done by a post-processor which is loaded by mako-render, providing
access to the entire context. Said processor may also drop results
altogether and thus prevent files from being written that have been
split up by it.
* allow renaming executables, for now just brute-force using a boolean
flag. If we have more binaries at some point, we might want to be more
elaborate.
* everything related to docopts functionality is now in the docopts
module.
Related to #45
* macro 'alias' was renamed to 'rename'
* fixed `cargo test` on main project
The latter pointed me to the serde issue, which would have made
everything fail when actually used to communicate with Google servers.
* keywords are no longer than 20 characters, which is a restriction
cargo imposes
* don't use 'homepage' link in Cargo.toml unless the homepage is
non-empty
* Added all publish-results to mark the respective crate version
Related to #46
It can be selected for each type of program we want to build, and makes
sense for everything that is not a library.
We also tried to unify names and folders a bit more, even though there
certainly is more work to be done to be fully non-redundant.
Fixes #43
This allows us to build efficiently. CLI programs can now have their
own cmn.rs implementation, which we can test standalone with
`cargo test`.
The primary makefile currently just explicitly pulls in the type-*.yaml
files; one day we could possibly put that into a loop.
Fixes #11
This is the first of many changes to come.
We try to leverage our ability to merge multiple data sources into one
to abstract away what we are actually doing, and of course, to allow
sharing the majority of the code, where applicable.
Previously, it would query the size from the wrong dict and obtain
the value 0 all the time. This would have made every upload fail with
`UploadSizeLimitExceeded`.
Now we obtain the actual size limit, and will ignore it if it is
unset/0 for some reason.
Patch += 1
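A rough sketch of the corrected check, assuming the limit comes from the
method's media-upload metadata; the function and names are illustrative
only:

```rust
// Illustrative only: `max_size` would come from the method's media-upload
// metadata (previously read from the wrong dict, which always yielded 0).
fn check_upload_size(max_size: u64, actual_size: u64) -> Result<(), String> {
    // A limit of 0 means "unset"; in that case we do not enforce anything.
    if max_size > 0 && actual_size > max_size {
        return Err(format!(
            "upload of {} bytes exceeds the size limit of {} bytes",
            actual_size, max_size
        ));
    }
    Ok(())
}

fn main() {
    assert!(check_upload_size(0, 1024).is_ok());    // unset limit: ignored
    assert!(check_upload_size(512, 1024).is_err()); // limit enforced
}
```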
Must be `Option<Box<T>>` now, whereas a simple `Box<T>` worked
previously. Anyway, serde can't decode/encode Boxes yet, so
plus1 was removed from the list of APIs to build.
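As a minimal illustration of why the recursive field needs
`Option<Box<T>>` rather than a plain `Box<T>` (the struct name here is
made up):

```rust
// A self-referential type: `Box<Node>` alone would force an infinite chain,
// so the recursive field has to be `Option<Box<Node>>` to allow termination.
struct Node {
    value: i64,
    next: Option<Box<Node>>,
}

fn main() {
    let list = Node {
        value: 1,
        next: Some(Box::new(Node { value: 2, next: None })),
    };
    let second = list.next.as_ref().map(|n| n.value);
    println!("{} -> {:?}", list.value, second);
}
```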
Related to #34
This also includes documentation stating which methods actually support
media download, and how to achieve that.
Added a TODO so we don't forget that we should tell the user how to
achieve these kinds of things.
Fixes #21
These 'methods' have no resources, and need slightly special handling.
This version at least makes the generator work, even though
it produces duplicates.
However, as it is so ugly, I'd rather consider changing it
substantially ... this feature should just come naturally.
This caused cargo on a case-sensitive file-system not to find the
cargo file, which made it look upwards in the directory structure
to find the correctly named Cargo.toml of the 'cmn' development
project.
It seems nearly nothing can be taken for granted ;).
It's best to just run against a big set of APIs and fix issues as they
arise though.
More flexibility means more maintenance, after all.
... it shows that the override I used previously won't work for `admin`.
Therefore we have to keep the actual value, instead of degenerating it.
Makes sense ... it's interesting how much one tends to hard-code things
to work just for a few cases, unless you opt in to see the whole picture.
This file is completely generated, and allows us to easily bring in
new versions after each JSON update.
To make that work, we simply merge all data handed to mako-render
inside of it. That way, we can put 'api/list' data in any yaml.
That way, we make retrieved tokens independent of the order scopes
were passed in. Additionally, we can pass any scopes, just in case
someone uses one token for multiple APIs.
Let's keep it flexible.
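A sketch of the idea, assuming we sort and de-duplicate the scopes
before deriving the token key; `scope_key` is just an illustrative name:

```rust
// Illustrative: normalize the requested scopes so the resulting key (and thus
// the cached token) no longer depends on the order the caller passed them in.
fn scope_key(scopes: &[&str]) -> String {
    let mut sorted: Vec<&str> = scopes.to_vec();
    sorted.sort();
    sorted.dedup();
    sorted.join(" ")
}

fn main() {
    let a = scope_key(&["https://www.googleapis.com/auth/drive",
                        "https://www.googleapis.com/auth/youtube"]);
    let b = scope_key(&["https://www.googleapis.com/auth/youtube",
                        "https://www.googleapis.com/auth/drive"]);
    assert_eq!(a, b); // same token key regardless of scope order
}
```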
Just to have another, different set of API information to deal with,
and not accidentally hard-code things to work with YouTube only.
Prepared dealing with media uploads, and it turns out to be best to
adjust the 'doit()' to take the respective type parameter.
We also have to think about downloads, like the ones for Google Drive,
which require custom query parameters.
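Roughly what a doit() taking the upload stream as a type parameter could
look like; the builder name, bounds and signature are assumptions, not
the generated API:

```rust
use std::io::Read;

// Hypothetical method-builder; in generated code this would carry the
// request body and query parameters.
struct InsertCall;

impl InsertCall {
    // The upload stream is a type parameter, so callers can hand in a file,
    // a cursor, or anything else implementing `Read`.
    fn doit<R: Read>(self, mut media: R, mime_type: &str) -> std::io::Result<u64> {
        let mut buf = Vec::new();
        let size = media.read_to_end(&mut buf)? as u64;
        println!("would upload {} bytes as {}", size, mime_type);
        Ok(size)
    }
}

fn main() -> std::io::Result<()> {
    let bytes = std::io::Cursor::new(b"hello".to_vec());
    InsertCall.doit(bytes, "text/plain")?;
    Ok(())
}
```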
This includes descriptions, of course, and generally seems to look
quite neat. For now, we brutally consume all input to own it,
but in the future we might be able to put in Borrow to support them all.
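A small sketch of how a Borrow-based setter could accept both borrowed
and owned strings; the builder and field are hypothetical:

```rust
use std::borrow::Borrow;

// Hypothetical builder for a generated resource; today's code would only take
// an owned String. The Borrow bound sketches how both &str and String could
// be accepted without forcing the caller to allocate up front.
#[derive(Default, Debug)]
struct VideoSnippet {
    title: String,
}

impl VideoSnippet {
    fn title<S: Borrow<str>>(mut self, title: S) -> Self {
        self.title = title.borrow().to_owned();
        self
    }
}

fn main() {
    let a = VideoSnippet::default().title("borrowed");
    let b = VideoSnippet::default().title(String::from("owned"));
    println!("{:?} {:?}", a, b);
}
```

That way callers holding only a `&str` wouldn't be forced to allocate
just to satisfy the setter.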
Everything we have, feature-wise, is now documented in a first version
at least.
We shall keep this up to date with what we are implementing, which also
helps in figuring out a good API.
That way, we have a common library to pull in from the main repository,
and a space for testing new code (in a partial implementation).
Next there will be generated object structures.