Uploading OBO and OWL ontologies

Vyasa's Layar API can automatically recognize OBO and OWL files and categorize them as ontologies, making uploading them as simple as uploading any other document! As always, we are going to start by instantiating the proper API class.

Instantiating the SourceDocumentApi

We are going to upload our OBO or OWL ontology the same way we would upload any other document, which means we will be using the same method! Start by instantiating the SourceDocumentApi and setting up your data providers. Make sure to include only your own instance as a data provider, since that is the provider on which you have the permissions needed to edit.
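The snippets below assume you already have an authenticated client object from Setting Up Your Environment. If you have not created one yet, a minimal sketch of what that setup might look like is shown here; it assumes the swagger-style Configuration and ApiClient classes used by the layar_api library, and the host and access token values are placeholders you will need to replace with your own.

import layar_api

# Assumed setup sketch: replace the placeholders with your own instance and token
configuration = layar_api.Configuration()
configuration.host = 'https://***YOUR_INSTANCE_NAME***.vyasa.com'
configuration.access_token = '***YOUR_ACCESS_TOKEN***'
client = layar_api.ApiClient(configuration)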

# Instantiate the document API with the authenticated client
documents = layar_api.SourceDocumentApi(client)
# Use only your own instance as the data provider
providers = '***YOUR_INSTANCE_NAME***.vyasa.com'

Find Your Ontology

You will need to know either the absolute or relative file path of the ontology you want to upload from your machine. For example, if the file is stored in the same directory as your working Python file, the relative path could be ./my_ontology.obo. If you are unsure how to find the path to your ontology, you can try the trick we went over in Setting Up Your Environment and use your terminal to find the file path.
Once you have your absolute or relative file path, add it to a variable as below.

my_ontology = './my_ontology.obo'
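If you want to confirm the path before uploading, you can add an optional sanity check with Python's standard library; this is not part of the Layar API, just a quick guard against typos in the path.

from pathlib import Path

# Optional check: make sure the ontology file actually exists at this path
if not Path(my_ontology).is_file():
    raise FileNotFoundError(f'Could not find an ontology at {my_ontology}')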

Calling The create_document Method

You are now ready to upload your ontology! Make sure you set the x_vyasa_data_providers, file, and cortex_document_type parameters, along with the optional name parameter, as shown below.

from pprint import pprint

upload_ontology = documents.create_document(x_vyasa_data_providers=providers, file=my_ontology,
                                            cortex_document_type='ONTOLOGY', name='My Ontology')
# Print the resulting Layar object (optional)
pprint(upload_ontology)
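If you have several ontologies to upload, you can reuse the same create_document call in a loop. This is just a minimal sketch: the file names below are placeholders, and the document name is simply set to each file path for illustration.

# Placeholder list of ontology files to upload
my_ontologies = ['./my_ontology.obo', './another_ontology.owl']

for path in my_ontologies:
    result = documents.create_document(x_vyasa_data_providers=providers, file=path,
                                       cortex_document_type='ONTOLOGY', name=path)
    pprint(result)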