wikibase_rest_api

A Rust implementation of a Wikibase REST API client

This Rust crate provides a client for the Wikibase REST API. It works on any MediaWiki installation that has the Wikibase extension and the Wikibase REST API enabled.
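
For wikis other than Wikidata, the client is pointed at that wiki's own rest.php endpoint instead of using the RestApi::wikidata() shortcut shown below. The following is a purely hypothetical sketch: the builder and method names are placeholders, not this crate's confirmed API, so check the crate documentation for the real constructor.

// Hypothetical sketch: `builder`, `api` and `build` are placeholder names, not
// confirmed methods of this crate; only RestApi::wikidata() appears in the usage example.
let api = RestApi::builder()
    .api("https://my-wikibase.example.org/w/rest.php")
    .build()?;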

Usage

See also the examples.
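
The snippets below use .await and ?, so they need to run inside an async function that returns a Result. A minimal scaffold could look like the following (the Tokio runtime and the import path are assumptions; adjust them to the crate's actual layout):

// Assumed scaffold: the import path and the Tokio runtime are not confirmed by this README.
use wikibase_rest_api::prelude::*; // adjust to the crate's actual module layout
use std::sync::Arc; // Arc is used by the concurrent-loading example below

#[tokio::main]
async fn main() -> Result<(), RestApiError> {
    // ... place the example code below inside this function ...
    Ok(())
}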

// Create an API (use the Wikidata API shortcut)
let api = RestApi::wikidata()?;

// Use Q42 (Douglas Adams) as an example item
let id = EntityId::new("Q42")?;

// Get the label and sitelink of Q42
let q42_label_en = Label::get(&id, "en", &api).await?.value().to_owned();
let q42_sitelink = Sitelink::get(&id, "enwiki", &api).await?.title().to_owned();
println!("Q42 '{q42_label_en}' => [[enwiki:{q42_sitelink}]]");

// Create a new item
let mut item = Item::default();
item.labels_mut()
    .insert(LanguageString::new("en", "My label"));
item.statements_mut()
    .insert(Statement::new_string("P31", "Q42"));
let item: Item = item.post(&api).await.unwrap();
println!("Created new item {}", item.id());

// Load multiple entities concurrently
let entity_ids = [
    "Q42", "Q1", "Q2", "Q3", "Q4", "Q5", "Q6", "Q7", "Q8", "Q9", "P214",
]
.iter()
.map(|id| EntityId::new(*id))
.collect::<Result<Vec<_>, RestApiError>>()?;

// A container will manage the concurrent loading of entities.
let api = Arc::new(api);
let entity_container = EntityContainer::builder()
    .api(api)
    .max_concurrent(50)
    .build()?;
entity_container.load(&entity_ids).await?;
let q42 = entity_container
    .items()
    .read()
    .await
    .get("Q42")
    .unwrap()
    .to_owned();
let q42_label_en = q42.labels().get_lang("en").unwrap();
println!("Q42 label[en]: {q42_label_en}");

Implemented REST API actions

items

  • post
  • get
  • patch

properties

  • post
  • get
  • patch

sitelinks

  • get item_id
  • patch
  • get item_id/sitelink_id
  • put item_id/sitelink_id
  • delete item_id/sitelink_id

labels

  • get item_id
  • patch item_id
  • get property_id
  • patch property_id
  • get item_id/language_code
  • put item_id/language_code
  • delete item_id/language_code
  • get item_id/language_code with fallback language
  • get property_id/language_code
  • put property_id/language_code
  • delete property_id/language_code
  • get property_id/language_code with fallback language

descriptions

  • get item_id
  • patch item_id
  • get property_id
  • patch property_id
  • get item_id/language_code
  • put item_id/language_code
  • delete item_id/language_code
  • get item_id/language_code with fallback language
  • get property_id/language_code
  • put property_id/language_code
  • delete property_id/language_code
  • get property_id/language_code with fallback language

aliases

  • get item_id
  • patch item_id
  • get property_id
  • patch property_id
  • get item_id/language_code
  • post item_id/language_code
  • get property_id/language_code
  • post property_id/language_code

statements

  • get item_id
  • post item_id
  • get item_id/statement_id (same as get statement_id)
  • put item_id/statement_id (same as put statement_id)
  • patch item_id/statement_id (same as patch statement_id)
  • delete item_id/statement_id (same as delete statement_id)
  • get property_id
  • post property_id
  • get property_id/statement_id (same as get statement_id)
  • put property_id/statement_id (same as put statement_id)
  • patch property_id/statement_id (same as patch statement_id)
  • delete property_id/statement_id (same as delete statement_id)
  • get statement_id
  • put statement_id
  • patch statement_id
  • delete statement_id

misc

  • /openapi.json
  • /property-data-types
  • search items (not yet implemented in the Wikibase REST API v0?)
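
The usage example above suggests that these routes are wrapped by small accessor types such as Label and Sitelink. As a hedged sketch, assuming Description mirrors the Label::get signature from the usage example (the actual method and accessor names may differ):

// Assumption: Description::get and .value() mirror Label::get from the usage example.
// This reuses the `api` and `id` values set up there.
let description_en = Description::get(&id, "en", &api).await?;
println!("Q42 description[en]: {}", description_en.value());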

Developer notes

TODO

  • Maxlag/rate limits?

Code analysis is run via analysis.sh.

Code coverage

cargo install cargo-tarpaulin # Once
cargo tarpaulin -o html

Lizard

Lizard is a simple code analyzer that reports cyclomatic complexity and related metrics: https://github.com/terryyin/lizard

lizard src -C 7 -V -L 40

Analysis

Run rust-code-analysis.py (requires rust-code-analysis-cli to be installed) to generate analysis.tab. This contains many metrics on code complexity and quality.

./rust-code-analysis.py

Tarpaulin

cargo tarpaulin -o html

grcov

grcov

Miri

Installation and usage: https://github.com/rust-lang/miri

cargo +nightly miri test

License

Dual-licensed under Apache-2.0 (LICENSE-APACHE2) and MIT (LICENSE-MIT).
