The LinkedUp Toolbox is for organisers of open data competitions, whether in academic or industry settings.

The toolbox contains the following ‘tools’:

Competition Framework

When we started planning our competitions, we could not find any established best practice for designing a challenge on open educational data. Each challenge or competition has, of course, its own context, peculiarities and requirements. What we provide here is a set of lessons learnt and guidelines drawn from our experience, including a decision tree that may help you decide on the structure of the competition’s prizes.


Prize Decision Tree [pdf | png]

Evaluation Framework

If it is a competition, there must be a winner. The question then is: how do you determine which applications deserve the most credit, and what are the criteria and indicators that build up a unified framework for assessing the submissions? In this section we share a systematic, evidence-based approach to deriving the criteria, using indicators from the LinkedUp Evaluation Framework.


Evaluation Framework Q&A [text file]
Evaluation Framework Lessons Learnt [text file]
Evaluation Framework image [png]

Guidance schedule

Organising a competition means thinking ahead and carefully planning when things need to be done. For example, when should one officially launch the competition, and what does ‘launching’ mean? At what point does the submission system need to be ready? What is the best time to invite people to the evaluation committee, and how much time should one reserve for the review process? The guidance schedule is a draft timeline with milestones, checklists and promotional activities that can be adapted to your competition’s needs.


Competition Guidance Schedule [pdf]


Open data support

How data fits into your competition depends on many factors. First, the competition might rely on specific datasets that must be used, or require all entries to use the same data so they can be compared. Even if that is not the case, collecting data for competitors to use can help get the competition off the ground. The challenge here is to put competitors in the best possible position to show off their abilities, by preventing data identification, collection, access and manipulation from becoming an entry barrier.


Open data support Decision Tree [pdf | png]

Promotion methodology

To ensure you receive entries to your open data competition, you will need to promote it to your key audience and beyond. This section supports a successful promotion strategy through lists of commonly used competition-related terms in academic and industry-oriented settings, and mind maps of promotion approaches and audiences.


Commonly used terms [text file]
Mindmap of promotion approaches [online | pdf]
Mindmap of audiences [online | pdf]

Legal and IPR

This section summarises the legal implications challenge designers may face when creating their competitions. It gives a clear overview of the relevant legislation regarding, for example, challenge design, handling contributions, using data and drawing up an evaluation framework, helping creators understand the legal background of developing and deploying their challenge while keeping in mind the needs of potential participants. The ultimate goal is for your challenge to be fully consistent with legal requirements, which working through the legal section of this toolbox supports.


Video Answers:

  • Could you give a quick overview of the main legal issues with regard to semantic technologies? [video]
  • What should you bear in mind when trying to understand a linked open data license? [video]
  • Which Open Licenses are most common? [video]
  • What license would you recommend for licensing your prototype or app? [video]
  • How do you deal with personal data? [video]
  • Legal and IPR Q&A full text [text file]
