The tale
Once upon a time, in the beautiful kingdom of Datamania, there lived a prince named Prince Fairhair. Though he was gentler than most, and good-looking too, his father would not let him choose the love of his life on his own. No, he was destined to marry a woman from the neighbouring kingdom. He did not even know her name, only that she was referred to as My Fair Lady.
Before the father of My Fair Lady would accept the marriage, he had a quest for Prince Fairhair. Only by fulfilling the quest would the prince be able to marry the princess. His quest was to find out how to turn water into gold: a quest that would require gathering loads of data chests and looking for clues that could lead to the recipe.
Luckily, Prince Fairhair was not alone in his quest. One of the castle wings housed a number of wizards who could help him decrypt and investigate the data chests. However, it was impossible for the data wizards to go and hunt for data themselves. Thus, to assist them, a huge number of elves were trained to look for data chests. The elves had read books, journals, comics and even poetry to know where to look. The quest was about to begin, and the elves went hunting for data chests all over the kingdom of Datamania and in empires far, far away.
The truth
The FAIR principles were first published in 2016. They are guidelines for good data management practice that aim at making data FAIR: Findable, Accessible, Interoperable and Reusable. Each letter covers a list of principles, 15 in total.
Although they originate from the life sciences, the principles can be – and have been – applied within other research disciplines, including the humanities and the social sciences. Since their publication, the European Union as well as individual funders and universities have declared their support for and approval of the FAIR principles. This support spans from the creation of data management tools and infrastructures to the definition of policies for data handling. Some implementations stick closely to the original definitions, while others are inspired by the spirit of the FAIR principles.
A fundamental prerequisite for understanding FAIR is to know that both humans and machines are intended consumers of the data. This enables an ecosystem that responds quickly to change and adapts automatically to new findings. That is the reason for focusing on standards for the data, identification mechanisms, availability of the data and so on. Secondly, the FAIR principles apply to both data and its metadata, i.e. records that describe data sets. That is why the term “(meta)data” is used in the principles. Thirdly, the principles are not only about open data: you can work in a FAIR manner with data that is not intended for public availability.
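To make the idea of machine-readable metadata a little more concrete, the sketch below shows what a minimal metadata record for a data set might look like, written in Python and serialised as JSON. The field names, the data set and the identifier are invented for illustration only; real communities use richer, standardised schemes (for example DataCite or Dublin Core) and assign genuine persistent identifiers such as DOIs.

import json

# A hypothetical, minimal metadata record describing a data set.
# All values below are invented for illustration.
metadata = {
    "identifier": "https://doi.org/10.1234/example-dataset",  # persistent identifier (invented DOI)
    "title": "Water samples from the kingdom of Datamania",
    "creator": "Prince Fairhair",
    "date_published": "2016-03-15",
    "license": "https://creativecommons.org/licenses/by/4.0/",
    "format": "text/csv",
    "keywords": ["alchemy", "water", "gold"],
}

# Serialised as JSON, the same record can be read by humans and parsed by machines.
print(json.dumps(metadata, indent=2))

The point of the sketch is not the particular fields but the principle: when metadata is expressed in a structured, standardised form with a persistent identifier, a machine can find, harvest and combine such records automatically, which is exactly the kind of ecosystem the FAIR principles aim for.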
The FAIR principles do not represent a quality standard that you can use to evaluate tools, data, policies, etc.; such a standard would soon become out-of-date and inapplicable across research disciplines. The implementation of FAIR can be a gradual and systematic adoption of new work routines, or a huge leap where you replace one type of infrastructure with another. The implementation of the principles should be adapted to each research area, meaning that each community will make the principles work in its respective context.