Biodiversity Information Science and Standards
Standards
Corresponding author: Steven J. Baskauf (steve.baskauf@vanderbilt.edu)
Academic editor: Elycia Wallis
Received: 29 Aug 2022 | Accepted: 15 Dec 2022 | Published: 04 Jan 2023
© 2023 Steven Baskauf, Jennifer Girón Duque, Matthew Nielsen, Neil Cobb, Randy Singer, Katja Seltmann, Zachary Kachian, Mervin Pérez, Donat Agosti, Anna Klompen
This is an open access article distributed under the terms of the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Citation:
Baskauf SJ, Girón Duque JC, Nielsen M, Cobb NS, Singer R, Seltmann KC, Kachian Z, Pérez M, Agosti D, Klompen AML (2023) Implementation Experience Report for Controlled Vocabularies Used with the Audubon Core Terms subjectPart and subjectOrientation. Biodiversity Information Science and Standards 7: e94188. https://doi.org/10.3897/biss.7.94188
The Audubon Core vocabulary terms subjectPart and subjectOrientation are used to describe the depicted part of an organism and its orientation in an image. We describe the criteria and process for developing controlled vocabularies for these two terms. The vocabularies take the form of Simple Knowledge Organization System (SKOS) concept schemes and their terms are categorized using SKOS collections to allow users to select from particular sets of values appropriate for particular organism groups and their parts. We also report the results of implementation testing used to determine the usability of the proposed terms with actual images of living organisms and preserved specimens.
Biodiversity Information Standards (TDWG), multimedia metadata, two-dimensional still images
Publications in biology, especially those related to biodiversity, generally include images to portray the habitus and morphological structures of the organisms that are the subject of those works. An increasing number of repositories are making available large collections of digital images of organisms and their parts, drawing on sources ranging from natural history collection digitization to citizen science platforms. These images are increasingly used as a source for many forms of research, including morphological measurements, identification of individual organisms, and automated taxonomic identification. Especially as repositories become larger and research more automated, the ability to describe the depicted part of an organism and its orientation using a controlled vocabulary will greatly increase the usefulness of these images for research.
Morphbank :: Biological Imaging*
The Audubon Core Multimedia Resources Metadata Schema, often referred to simply as "Audubon Core" and abbreviated as AC, is a set of metadata vocabularies for describing biodiversity-related multimedia resources and collections (
Soon after the formation of the Audubon Core Maintenance Group in 2018, it chartered a Views Task Group*
As a coordinated addition to Audubon Core, the Views Task Group has followed the process outlined in Section 4 of the TDWG Vocabulary Maintenance Specification (VMS;
After acceptance of the Task Group charter, the group began to hold regular meetings. The work carried out at those meetings was documented in a series of meeting notes*
The initial task of the group was to solicit use cases from the community. Responses were received from image producers and consumers (including biodiversity researchers) and representatives of image aggregators. Contributors followed a template that was provided and the submitted use cases were organized by categories*
During the course of the Task Group's work, it became apparent that subjectPart designations could apply to only part of an image if the image contained multiple parts or even multiple organisms. Standardizing the description of regions of interest (ROIs) was not in the scope of this Task Group, but the eventual addition of an ROI vocabulary to Audubon Core *
Following the generation of candidate requirements, the Task Group reviewed existing approaches to describing subject parts and orientations, and identified ontologies that could be used to define parts and orientations unambiguously. Then work began on the actual construction of the two controlled vocabularies in the form of Simple Knowledge Organization System (SKOS) concept schemes (
The concept schemes were relatively unstructured, except for a few cases where concepts had a broader relationship to other concepts (for example, "right" orientation had the broader concept "lateral" orientation; Table
Term Name | acorient:r0004
Term IRI |
Modified | 2022-01-01
Term version IRI |
Label | right side
Definition | view of the right side of a whole bilaterally symmetric organism
Definition derived from |
Controlled value | right
Has broader concept | acorient:r0003
Type | Concept
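To illustrate how a concept like the one tabulated above can be expressed in SKOS, the following minimal sketch builds the corresponding triples with the rdflib Python library. The namespace IRI is a placeholder rather than the published term IRI, and the controlled value string is attached via skos:notation purely for illustration; the published vocabulary may use a different property.

```python
# Minimal sketch (not the normative vocabulary file) of expressing a concept
# like acorient:r0004 as SKOS triples. The namespace IRI is a placeholder and
# skos:notation stands in for however the controlled value string is published.
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import RDF, SKOS

ACORIENT = Namespace("https://example.org/acorient/")  # placeholder namespace

g = Graph()
g.bind("skos", SKOS)
g.bind("acorient", ACORIENT)

right = ACORIENT["r0004"]
lateral = ACORIENT["r0003"]

g.add((right, RDF.type, SKOS.Concept))
g.add((right, SKOS.prefLabel, Literal("right side", lang="en")))
g.add((right, SKOS.definition,
       Literal("view of the right side of a whole bilaterally symmetric organism", lang="en")))
g.add((right, SKOS.notation, Literal("right")))  # the controlled value string (illustrative)
g.add((right, SKOS.broader, lateral))            # "right side" has the broader concept "lateral side"

print(g.serialize(format="turtle"))
```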
During the construction of the controlled vocabularies, it became apparent that although satisfying some of the candidate requirements would be desirable, doing so would not be practical. In some cases, meeting a requirement would have made the vocabularies too complex for most users. In other cases, it would have made some of the concepts so granular that few users would use them and the size of the vocabularies would be impractically large. As a result, a large number of the candidate requirements were dropped prior to the creation of the final list of requirements (i.e., the final version of the VMS Feature Report). There were two cases where the candidate requirements from the Task Group helped drive other developments within Audubon Core:
The final requirement list is presented in Appendix A. Of the seventeen potential requirements derived from the submitted use cases, seven were included in the final requirements.
During the development process, ac:subjectPart and ac:subjectOrientation were added as custom metadata fields in Zenodo. Controlled string values appropriate for insects from the preliminary vocabularies were used to categorize images of fly specimens when they were submitted. Fig.
Specimen image https://zenodo.org/record/6084051 showing an anterior view of the head of a fly, Lachnocorynus stenocephalus. Image used under a CC BY license. Dikow, Torsten. (2022). Lachnocorynus stenocephalus Boschert and Dikow, 2021, head, anterior. Zenodo. https://doi.org/10.5281/zenodo.6084051
After the completion of the draft controlled vocabularies, the Task Group began planning for implementation testing by identifying potential testers and creating an implementation testing guide (Suppl. material
The group recognized that not every tester would use the vocabularies in the same way, so the guide described testing by manual entry, machine-guided entry, and machine processing. The guide also included the questions that would be asked on the feedback form at the conclusion of testing so that the participants would have a better idea of what to consider while carrying out the testing. The vocabularies were also made available to the testers as CSV tables*
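As an illustration of how the CSV form of a vocabulary could support simple guided entry, the sketch below groups controlled values by collection to build picklists. The file name and column names ('collection', 'controlled_value', 'label') are assumptions for illustration and may not match the layout of the published CSV tables.

```python
# Minimal sketch of using a CSV release of a controlled vocabulary to build
# picklists for guided entry. File and column names are hypothetical.
import csv
from collections import defaultdict

def load_picklists(csv_path):
    """Group (controlled value, label) pairs by the collection they belong to."""
    picklists = defaultdict(list)
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            picklists[row["collection"]].append((row["controlled_value"], row["label"]))
    return picklists

if __name__ == "__main__":
    picklists = load_picklists("subjectOrientation.csv")  # placeholder file name
    for value, label in picklists.get("headOrientations", []):  # placeholder collection name
        print(f"{value}\t{label}")
```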
After potential testers were identified, the Task Group held an optional workshop that gave the testers an opportunity to try applying the terms to a small number of images and to ask clarifying questions of the workshop presenters. Testers then carried out testing on a larger sample of images.
After completing the testing, implementers submitted feedback using a Google form (Suppl. material
There were five institutions participating in the implementation testing (Table
Organization | Taxonomic coverage | Number of images in test | Image type | Testing type
Field Museum (Field) | plants | 33 | live organisms, digitized specimens | manual entry
Dept. of Ecology and Evolutionary Biology, University of Kansas (Kansas) | marine invertebrates | 14 | live organisms | manual entry
Bioimages (Bioimages) | seed plants | 25 | live organisms | manual entry, machine processing
UC Santa Barbara, Cheadle Center for Biodiversity and Ecological Restoration (California) | Anthophila | 21 | digitized specimens | manual entry
Universidad de San Carlos de Guatemala -CUNZAC- (Guatemala) | plants | 8 | live organisms | manual entry
The detailed testing results are presented in Appendix B. All testers carried out testing manually by having a human refer to the human-readable lists of concepts or the CSV tables and then enter controlled value strings into a spreadsheet. All users selected orientation concepts from a collection appropriate for a particular part. Some users also used the collections of parts appropriate for organism groups, although this was generally unnecessary for testers whose images all came from a single organism group.
The testers applied the controlled vocabularies to a variety of types of organisms with several kinds of photographing circumstances: preserved specimens with fixed orientations and live organisms with uncontrolled and controlled orientations (Table
Generally, the problems that users encountered had less to do with the vocabularies themselves than with difficulties caused when an image was not restricted to a single organism, part, or orientation. In theory, problems related to inclusion of multiple organisms or parts could be addressed by defining regions of interest within the image and then applying the terms to those specific regions, but without machine assistance to demarcate those regions, record their bounds, and associate the values with those regions, that solution wasn't practical for a human assessing an entire image. This assistance could range from a simple manual "click and drag" tool for demarcating the regions to a fully automated system for detecting parts and orientations. Two of the testers noted that sometimes both upper side and lower side orientations are purposefully included in the same specimen or image, making this problem a frequent occurrence for photographs that include plant leaves. The problem of uncontrolled orientation was noted by two testers who photographed live organisms. Due to difficulties of photographing organisms that were not in a fixed position, or organisms whose parts pointed in multiple directions, the chosen orientation represented a "best estimate". One tester noted that users might have trouble selecting an appropriate part if they lacked the technical expertise to do so.
None of the testers used machine-guided selection. That is not surprising, since it would require development of new software or customization of existing software to make use of the machine-readable SKOS. However, that is likely to be an important use case in the future, after adoption of the vocabularies.
One implementer used machine processing to assign values based on existing text descriptions of the view. The process and results were described as follows: "For the automated conversion, I queried the Bioimages image dataset to create a spreadsheet of the view descriptor IRIs we use along with the associated part and 'view' labels. Each IRI was mapped in a table to an ac:subjectPartLiteral and ac:subjectOrientationLiteral value appropriate for that IRI. In some cases there wasn't a specific subject orientation, so I used 'unspecifiedOrientation'. See https://github.com/baskaufs/msc/blob/master/bioimages_views/stdviews_table.csv for the mappings. I then queried the database for the 25 images that were used for the human test and used the mapping table to assign AC controlled value strings for each of the images based on its descriptor IRI value. The values derived by a human were compared to those generated by the automated mapping. In all images, the subjectParts corresponded completely. Where a specific subject orientation could be assigned via mapping, that orientation agreed with the human assessment, but there were many cases where the mapping generated an 'unspecifiedOrientation' value when the human was able to make an assignment. This is just a limitation of our existing system to capture complete information about the orientation. See https://github.com/baskaufs/msc/blob/master/bioimages_views/mapping_test.csv for the results. The Python script to do the querying and mapping is at https://github.com/baskaufs/msc/blob/master/bioimages_views/bioimages_views.ipynb ."
This test was probably more successful than would be likely for a random provider since the Bioimages images were organized using the same Morphbank view categories that influenced the construction of the controlled vocabularies. Nevertheless, it showed that automated conversion was possible, although an "unspecifiedOrientation" assignment is likely to be the result when existing view descriptors cannot be perfectly mapped to the controlled vocabularies.
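The following sketch outlines the general shape of such a mapping-based conversion. It is not the Bioimages script referenced above; the file names and column names ('view_iri', 'subjectPartLiteral', 'subjectOrientationLiteral', 'image_iri') are hypothetical stand-ins for whatever an existing system records.

```python
# Minimal sketch of mapping existing view descriptor IRIs to Audubon Core
# controlled value strings. File and column names are hypothetical.
import csv

def load_mapping(path):
    """Build a dict from view descriptor IRI to (part, orientation) controlled values."""
    mapping = {}
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            mapping[row["view_iri"]] = (
                row["subjectPartLiteral"],
                # Fall back to the controlled value for "no orientation recorded".
                row["subjectOrientationLiteral"] or "unspecifiedOrientation",
            )
    return mapping

def assign_controlled_values(images_path, mapping):
    """Yield (image IRI, part, orientation) for each image row in the source table."""
    with open(images_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            part, orientation = mapping.get(row["view_iri"], ("", "unspecifiedOrientation"))
            yield row["image_iri"], part, orientation

if __name__ == "__main__":
    mapping = load_mapping("stdviews_table.csv")      # placeholder mapping table
    for image_iri, part, orientation in assign_controlled_values("images.csv", mapping):
        print(image_iri, part, orientation)
```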
There were three requirements (Appendix A) that testers did not implement:
1.4 Specify multiple parts in an image by applying subjectPart concepts to Regions of Interest within an image. (2-FILTER-1)
1.5 Distinguish between single and aggregate parts (e.g., one vs. several leaves) by applying multiple subjectPart concepts of the same type to Regions of Interest within a single image. (7-CLARITY-2)
2.2 For some organism groups, filter orientations so that selection is only possible if the feature is visible for a particular subject part. (8-ORIENT-1)
Requirements 1.4 and 1.5 depend on the implementation of Regions of Interest, which are not a feature of these vocabularies themselves, but rather a separate technology that none of the implementers had (yet) implemented. So the inability to implement them was not a deficiency of the vocabularies per se.
Requirement 2.2 depends on software development that was beyond the scope of what was expected of the testers. So again, the inability to implement during testing does not indicate a deficiency of the vocabularies themselves. Although the structural design of grouping concepts for filtering purposes was not used in a machine-assisted way (i.e., through software consuming the vocabularies as JSON-LD or tabular data), the division into SKOS collections was used to generate the human-readable documents to which most implementers referred when selecting concepts.
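For future machine-assisted filtering of this kind, consuming software could read the SKOS collections directly. The sketch below (using the rdflib Python library, with a placeholder file name and collection IRI rather than the published ones) lists the members of a collection together with their preferred labels, which is the basic operation needed to restrict a user's choices to concepts appropriate for a given part or organism group.

```python
# Minimal sketch of consuming the SKOS collections for filtering (not software
# produced by the Task Group). The file name and collection IRI are placeholders.
from rdflib import Graph, URIRef
from rdflib.namespace import SKOS

def concepts_in_collection(graph, collection_iri):
    """Return (concept IRI, preferred label) pairs for members of a skos:Collection."""
    members = []
    for member in graph.objects(URIRef(collection_iri), SKOS.member):
        label = graph.value(member, SKOS.prefLabel)
        members.append((str(member), str(label) if label else ""))
    return members

g = Graph()
g.parse("subjectOrientation.ttl", format="turtle")  # placeholder vocabulary file
# Placeholder IRI for a collection of orientations appropriate for a given part.
for iri, label in concepts_in_collection(g, "https://example.org/collection/headOrientations"):
    print(iri, label)
```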
The testing included some organism groups that were not included in the original set: cnidarians and bryophytes.
The testing suggested that the existing terms were not adequate when applied to cnidarians. Although "lateral side" was an appropriate description, "dorsal side" and "ventral side" did not make sense when applied to non-bilaterally symmetric animals. Therefore, two additional orientations, "oral side" and "aboral side", were added to make the vocabularies usable with groups like cnidarians and echinoderms.
The existing terms developed for seed plants were not adequate for all bryophyte images. During the testing period, additional terms were proposed for bryophytes, ferns, and fungi. Since there was insufficient time to test these new terms before the initial submission, they were designated as candidate terms to be considered as future additions to the vocabularies*
One result of the discussion about using the vocabularies with these other groups was changing the subjectOrientation terms for "adaxial side" and "abaxial side" to "upper side" and "lower side". That made them more broadly usable in organism groups that did not have a clearly defined central axis. It also made the terms easier for human users to select without error, since the strings "adaxial" and "abaxial" are more difficult to distinguish due to their visual similarity.
Controlled vocabularies are necessary for standardizing the way we manage and process information. By developing these views vocabularies, we provide a means to describe the content of images by manual entry, machine-guided entry, and machine processing. Test implementers were able to use the proposed terms without major issues. Although the controlled vocabularies for subjectPart and subjectOrientation were not able to handle every scenario imagined in the original request for use cases, they fulfilled most of the final requirements established by the Task Group (the "Feature Report" required by the VMS).
The usefulness of these controlled vocabularies would be increased if they were incorporated within tools for manually demarcating or automatically detecting regions of interest that correspond to subjectParts depicted in an image. We hope that developers will take advantage of this opportunity to associate parts of images with machine-readable metadata for describing what is depicted in those parts.
Moving forward, as the vocabularies are adopted and broadly used, we expect them to expand over time to include subjectParts that were not tested in this implementation. Through the TDWG term change process (
One reason for using SKOS to describe the controlled vocabularies is that it provides a mechanism for enriching them to make them more broadly usable. As we noted, SKOS collections have been used to group the concept terms in meaningful ways and skos:prefLabel has been used to add labels and definitions in Spanish. In the future, labels and definitions in other languages may be added as translations are completed and skos:altLabel may be used to document alternative labels for which users may search.
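A sketch of how such enrichment could be exploited by consuming software is shown below. It assumes the vocabularies are available as serialized RDF; the file name and concept IRI are placeholders rather than published term IRIs.

```python
# Minimal sketch of retrieving language-tagged labels and searching alternative
# labels once they are added. File name and IRIs are placeholders.
from rdflib import Graph, URIRef
from rdflib.namespace import SKOS

g = Graph()
g.parse("subjectPart.ttl", format="turtle")  # placeholder vocabulary file

def label_in_language(graph, concept_iri, lang):
    """Return the skos:prefLabel of a concept in the requested language, if any."""
    for label in graph.objects(URIRef(concept_iri), SKOS.prefLabel):
        if getattr(label, "language", None) == lang:
            return str(label)
    return None

def search_alt_labels(graph, text):
    """Find concepts whose skos:altLabel contains the search text (case-insensitive)."""
    return [(str(concept), str(label))
            for concept, label in g.subject_objects(SKOS.altLabel)
            if text.lower() in str(label).lower()]

print(label_in_language(g, "https://example.org/acpart/p0001", "es"))  # placeholder concept IRI
print(search_alt_labels(g, "wing"))
```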
Overall, the results of implementation testing demonstrated that the vocabularies are ready for adoption and inclusion as part of the Audubon Core standard.
Note: source use cases*
Subject part
1 Categorization
1.1 Subject part values are grouped appropriately for broad categories of organisms (e.g., woody angiosperms, insects). Selecting a SKOS Collection will allow a user to find a group of part concepts appropriate for a particular category of organisms. (1-CATEGORIZE-1)
1.2 Concepts are linked to well-known ontologies to clarify definitions and standardize labels. However, the actual concepts are TDWG-adopted terms, providing stability that might not exist in the source ontologies. (6-ANATOMY-1) Ontologies used were the Biological Spatial Ontology (BSPO)*
1.3 Concepts allow for distinguishing between sexes (if multimorphic) by selecting narrower categories of subjectPart. (added during discussion)
1.4 Specify multiple parts in an image by applying subjectPart concepts to Regions of Interest within an image. (2-FILTER-1)
1.5 Distinguish between single and aggregate parts (e.g. one vs. several leaves) by applying multiple subjectPart concepts of the same type to Regions of Interest within a single image. (7-CLARITY-2)
2 Relationship between part and orientation
2.1 Determine what orientations are appropriate for subject parts other than whole organism. (3-MEASURE-4, 1-CATEGORIZE-2)
2.2 For some organism groups, filter orientations so that selection is only possible if the feature is visible for a particular subject part. (8-ORIENT-1)
Responses from individual testers to the questions on the feedback form are included in Table
Organization | Testing details | Difficulties in selecting concepts
Field | We used photos of live plants as well as herbarium specimens for this exercise. We had 3 people (2 data, 1 botanist) view a set of 33 images and independently assign values for subjectPartLiteral and subjectOrientationLiteral. We then compared the results and recorded any questions that arose. |
This happened frequently. Most of our images were of whole organisms, without clearly defined regions of interest. In the case of herbarium specimens, for example, it is oftentimes the goal to articulate the plant in a way that shows both the adaxial and abaxial sides of the leaf. There were many times when a part or orientation of a plant was present in the image but may not have been the focus of the image, or was only partially visible, so we were unsure whether those parts/orientations should be recorded. There were also times when the clarity of the image made it difficult to identify a part or orientation. It was sometimes hard for the "data" people on the team to identify the subjectPart, and the botanist's expertise was needed. Missing concepts: For subjectPartLiteral: Suggest adding "petiole" and "stipules" (for a plant's leaf); it may be useful to include the maturity of reproductive parts, e.g., "bud" and "flower"; "trunk" should be added since sometimes people specifically take a photo of the trunk to show certain characteristics; the term "bark" may not be useful if we have "twig" and "trunk"; technically, when an inflorescence becomes a fruit, it is called an "infructescence"; also the term "peduncle" came up; the botanist indicated it as a helpful term to include. For subjectOrientationLiteral, maybe add "transverse" and "longitudinal" for fruits. Oftentimes, people take a photo of cut fruit. |
Kansas | The types of images I chose were all live images, mainly from a marine field station and also images through a microscope from my lab, which works with live cnidarians. The field station images included a mix of specimens that were in the field (i.e. jellyfish in the water), in a glass jar or container, or specimens through a microscope. I am trained as a marine invertebrate and evolutionary biologist, so the types of images I would be taking would be for documenting species that we may bring back to the lab for various types of assays or sequencing experiments. For sequencing in particular, it is useful to have images of the specimens used, but getting "clean" images with easily definable features or single organisms can be difficult. In selecting images, I wanted to get a range of body types and phyla, both ones that were easily identifiable in terms of orientation as well as unconventional images from the field that may be trickier to categorize. This is why I went over the 5-10 image range for manual testing. |
For radially symmetric animals or animals with multiple individuals (e.g., colonial animals, groups of animals), it was tricky to determine the most appropriate concepts. For multiple individuals I selected a focal "individual" centered in the image, but of course this is subjective for each viewer. Missing concepts: Many of the animals that I work with are radially symmetric (or similar, e.g., the pentameral symmetry of sea stars). While dorsal, ventral, and lateral were sufficient for most tasks, for cnidarians (and ctenophores) symmetry is described as oral-aboral, so those would be more appropriate concepts for orientation. Insufficient granularity: Colonial animals could perhaps require a specific identification, or at least a way to distinguish them from other organism types, since "entireOrganism" could mean the entire colony or an entire individual of the colony. Cnidarians or non-bilateral animals in general may also require additional terms. |
Bioimages | I applied the controlled values to a broad range of plant parts across woody angiosperms, herbaceous angiosperms, and gymnosperms. I examined the image and selected appropriate values from the spreadsheet of available values. In some cases, I referred to the lists of orientations appropriate for parts to make sure that the orientation I was selecting was appropriate for the part. I copied the selected controlled value from the spreadsheet of values to the spreadsheet where I was recording the test results, along with the image filename and GUID (IRI) for the image. Since I was working alone, I didn't have anyone to check my work. However, I did crosscheck using the automated mapping. | Photos of leaves intentionally included both adaxial and abaxial sides to show the difference in surface characteristics. I chose the most prominent side, but when leaf margins were photographed, there wasn't really a predominant side. See http://bioimages.vanderbilt.edu/baskauf/41902 for an example. The orientation for inflorescences was sometimes difficult to determine when multiple inflorescences were visible (e.g., clusters of catkins). In photos of dehiscing fruit, there wasn't really an appropriate view -- what would be the lateral side of the fruit was on the outside and not visible when the fruit interior was photographed. See http://bioimages.vanderbilt.edu/baskauf/24261 for an example. It was difficult to determine whether images of juvenile herbaceous plants were apical or lateral because they were usually photographed at an angle. The orientation of clusters of male cones was a similar situation to that of catkins. Usually the images did not include just a single cone and the cones were sticking out at differing orientations. In cases where leaves were not laminate (for example, pine needles) or were rounded (for example, sedum: http://bioimages.vanderbilt.edu/baskauf/21935), it wasn't clear which side of the leaf was adaxial or abaxial. For plants where the leaves emerged vertically in whorls (e.g., yucca: http://bioimages.vanderbilt.edu/baskauf/14597), it is difficult to photograph one side of a single leaf. For some photos of a part of a plant part (trunk of a whole tree, internal parts of a flower), it was difficult to specify the orientation because the whole part was not visible and the image was taken at an angle to the feature that wasn't apical or lateral. In some cases where the inflorescence consisted of a single flower, it wasn't clear whether the part should be "flower" or "inflorescence". See for example http://bioimages.vanderbilt.edu/baskauf/50597 (For specific details, see Table
California | The 1,243,540 images are categorized in a general way to facilitate searching both by orientation and part. https://library.big-bee.net/portal/imagelib/search.php |
It was generally easy, but difficult when labels were present or were the only part of the specimen imaged. Insufficient granularity: Hymenoptera technically do not have a thorax and abdomen; these are actually the mesosoma and metasoma due to the place of constriction of the waist. This is a pretty technical difference, and thorax/abdomen can be considered general terms for the regions before and after the constriction. (For specific details, see Table |
Guatemala | I tried to contribute information for bryophytes, a group of plants that was not included initially. | There is still some work needed to complete parts and orientations for bryophytes; however, the values proposed seem to be functional for this group of plants.
Specific comments about test images from the Bioimages test are included in Table
Detailed results from Bioimages testing. The image_identifier is appended to a base IRI of http://bioimages.vanderbilt.edu/.
image_identifier | ac:subjectPartLiteral | ac:subjectOrientationLiteral | notes
baskauf/25638 | entireOrganism | lateral | |
baskauf/12625 | entireOrganism | lateral | The part is more like "trunk" than whole organism. |
baskauf/41910 | bark | lateral | |
baskauf/63779 | twig | lateral | |
baskauf/41905 | leaf | adaxial | Hard to remember that "adaxial" is the upper leaf surface. |
baskauf/41902 | leaf | adaxial | About half of the image is adaxial of one leaf and abaxial of another. |
baskauf/41887 | leaf | adaxial | Image shows many leaves, all of the adaxial side, but probably needs to use ROIs. |
baskauf/42310 | inflorescence | lateral | Orientation not clear as there are various inflorescences sticking out. |
baskauf/50597 | inflorescence | lateral | Not sure if I should call this a flower or inflorescence |
baskauf/50743 | flower | apical | |
baskauf/50741 | flower | apical | Just part of the flower |
baskauf/41891 | fruit | lateral | |
baskauf/24261 | fruit | lateral | Not really lateral, the fruit is dehiscing |
baskauf/65236 | stem | lateral | |
baskauf/57859 | entireOrganism | apical | Several plants, some more lateral than apical |
baskauf/27473 | flower | apical | |
baskauf/61716 | leaf | abaxial | not sure how you can tell the orientation for needles |
baskauf/51363 | femaleCone | lateral | |
baskauf/51365 | maleCone | lateral | several cones, a variety of orientations |
baskauf/33496 | leaf | adaxial | |
baskauf/33504 | inflorescence | lateral | |
thomas/0627-01-01 | entireOrganism | lateral | |
baskauf/21935 | leaf | abaxial | difficult to say if this is leaf or stem and the leaves are not really laminate, so orientation is difficult |
baskauf/14597 | leaf | adaxial | because of the whorled nature of this, both abaxial and adaxial orientations are present |
Specific comments about test images from the California test are included in Table
image | ac:subjectPartLiteral | ac:subjectOrientationLiteral | Notes | Image description
| hindwing | dorsal | | Bee images from the Big-Bee project: https://library.big-bee.net/portal/taxa/index.php?taxon=3667
https://serv.biokic.asu.edu/imglib/ecdysis/UCSB_IZC/UCSB-IZC00036/UCSB-IZC00036938_1633713553.jpg | entire organism | dorsal | no way to recognize label which is part of a specimen | Bee images from the Big-Bee project: https://library.big-bee.net/portal/taxa/index.php?taxon=3667
| entire organism | anterior | As part of a series of 2D images that go around a specimen it is not a specific orientation but really between orientations. | Bee images from the Big-Bee project: https://library.big-bee.net/portal/taxa/index.php?taxon=3667
https://monarch.calacademy.org/mnt/target-images/CASTYPE/00001/CASTYPE1503_h.jpg | head | anterior | | Bee images from the Big-Bee project: https://library.big-bee.net/portal/taxa/index.php?taxon=3667
https://ids.si.edu/ids/deliveryService/id/ark:/65665/m3642ca13b63774a5c9f907938471ad74f/1200 | entire organism | lateral | Also includes a good image of the wing | Bee images from the Big-Bee project: https://library.big-bee.net/portal/taxa/index.php?taxon=3667
https://monarch.calacademy.org/mnt/target-images/CASTYPE/00001/CASTYPE1506_label.jpg | N/A | N/A | Nothing applies. | Bee images from the Big-Bee project: https://library.big-bee.net/portal/taxa/index.php?taxon=3667
https://monarch.calacademy.org/mnt/target-images/CASTYPE/00001/CASTYPE1506_l.jpg | entire organism | lateral | Also includes a good image of the wing | Bee images from the Big-Bee project: https://library.big-bee.net/portal/taxa/index.php?taxon=3667
| entire organism | ventral | As part of a series of 2D images that go around a specimen it is not a specific orientation but really between orientations. | Bee images from the Big-Bee project: https://library.big-bee.net/portal/taxa/index.php?taxon=3667
| entire organism | lateral | Also includes a good image of the wing | Bee images from the Big-Bee project: https://library.big-bee.net/portal/taxa/index.php?taxon=6324
| entire organism | dorsal | no way to recognize label which is part of a specimen | Bee images from the Big-Bee project: https://library.big-bee.net/portal/taxa/index.php?taxon=6324
| entire organism | lateral | As part of a series of 2D images that go around a specimen it is not a specific orientation but really between orientations. | Bee images from the Big-Bee project: https://library.big-bee.net/portal/taxa/index.php?taxon=6324
| abdomen | dorsal | As part of a series of 2D images that go around a specimen it is not a specific orientation but really between orientations. | Bee images from the Big-Bee project: https://library.big-bee.net/portal/taxa/index.php?taxon=6324
https://ids.si.edu/ids/deliveryService/id/ark:/65665/m3be3a15601b514314a5737aa195fb9d36/1200 | entire organism | dorsal | Two specimens in single image | Bee images from the Big-Bee project: https://library.big-bee.net/portal/collections/individual/index.php?occid=1673075
https://ids.si.edu/ids/deliveryService/id/ark:/65665/m377be30404e8546a988597bf6689bebc0/1200 | N/A | N/A | Nothing applies. | Bee images from the Big-Bee project: https://library.big-bee.net/portal/collections/individual/index.php?occid=1673075
| forewing | unspecifiedOrientation | | Bee images from Big-Bee project: https://library.big-bee.net/portal/taxa/index.php?taxon=17374
| entire organism | ventral | | Bee images from the Big-Bee project
https://mczbase.mcz.harvard.edu/specimen_images/entomology/paleo/large/PALE-7514_Apis_henshawi.jpg | entire organism | ventral | | Bee images from the Big-Bee project
| thorax | lateral | In Hymenoptera, this is called a mesosoma because one segment of the abdomen is part of the structure. | Bee images from the Big-Bee project
| leg | unspecifiedOrientation | | Bee images from the Big-Bee project
| leg | unspecifiedOrientation | | Bee images from the Big-Bee project
https://serv.biokic.asu.edu/imglib/ecdysis/UCSB_IZC/UCSB-IZC00010/UCSB-IZC00010221.jpg | entire organism | dorsal | no way to recognize label which is part of a specimen | Bee images from the Big-Bee project
The Task Group thanks the Audubon Core Maintenance Group for their oversight and support during the vocabulary development process. Martin Stein was part of the original Task Group but was not able to continue working with the group. Torsten Dikow provided valuable feedback and examples during the vocabulary development process. Tomomi Suwa participated in the implementation testing on behalf of the Field Museum. John Oswald engaged the Task Group with interesting ideas from his work categorizing images that did not end up being incorporated as features of the vocabularies. David Fichtmueller, Sharif Islam, and Doug Palmer provided useful comments for improving the paper through their reviews.
SJB, JCGD, and MN wrote the manuscript and were core task group members. NSC and RS were core task group members. KCS, ZK, MP, DA, and AMLK were implementation testers. DA also participated in the task group during the development phase.
These notes were provided to test implementers as a guide to carrying out the testing.
This document is an export of the questions to which implementers responded after testing.
Morphbank :: Biological Imaging (https://www.morphbank.net/, 15 July 2022). Florida State University, Department of Scientific Computing, Tallahassee, FL 32306-4026 USA.
Proposal to "Revise ac:subjectOrientation and ac:subjectPart and add ac:subjectOrientationLiteral and ac:subjectPartLiteral" https://github.com/tdwg/ac/issues/195