Hasso-Plattner-Institut
Prof. Dr. Holger Giese
  
 

Ask Your Repository!

An infrastructure to categorize and retrieve project knowledge, combining a voice conversational interface with a project knowledge base enhanced by hybrid crowd-machine-learning classifiers.

Problem

When faced with a design problem, engineers make quick decisions about which tasks to perform. Many of these decisions are ad hoc or even look arbitrary. However, what engineers are actually doing is applying creative thinking to relate the current situation to their tacit knowledge. In this sense, engineers draw on their experience with similar problems to come up with creative solutions.

This might involve looking at data from the current project as well as data from previous projects. These data can consist both of artifacts (e.g., specifications, sketches, emails) and tasks (e.g., changes to artifacts, applied methods).

However, retrieving previous experiences and insights depends on one's ability to recall data promptly and in sufficient detail.

Except for certain types of data kept in versioning systems (e.g., source code and specifications), most project data are unstructured, i.e., they lack categories or dependencies. This makes it difficult for engineers to retrieve most data accurately and unobtrusively.

Solution

Our goal is to create a project knowledge base that can be queried easily and unobtrusively through a built-in search or via voice commands.

This way the team can retrieve any relevant project artifact quickly without having to know its exact location in a deep folder hierarchy. Search queries can be filtered by artifact type, creation date, the process stage in which the artifact was created, the users involved, or any relevant keyword describing the artifact.
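To illustrate, such filters could be translated into an Elasticsearch bool query. The sketch below builds the query body in Python; the field names (`type`, `created_at`, `stage`, `users`, `text`, `tags`) are illustrative assumptions, not our actual index schema:

```python
# Sketch: translate search filters into an Elasticsearch bool query.
# Field names (type, created_at, stage, users, text, tags) are
# illustrative assumptions, not the real schema of our knowledge base.

def build_artifact_query(artifact_type=None, created_after=None,
                         created_before=None, stage=None, users=None,
                         keyword=None):
    filters = []
    if artifact_type:
        filters.append({"term": {"type": artifact_type}})
    if created_after or created_before:
        date_range = {}
        if created_after:
            date_range["gte"] = created_after
        if created_before:
            date_range["lte"] = created_before
        filters.append({"range": {"created_at": date_range}})
    if stage:
        filters.append({"term": {"stage": stage}})
    if users:
        filters.append({"terms": {"users": users}})

    query = {"bool": {"filter": filters}}
    if keyword:
        # The free-text part scores matches in the artifact's extracted
        # text and its suggested tags.
        query["bool"]["must"] = [
            {"multi_match": {"query": keyword, "fields": ["text", "tags"]}}
        ]
    return {"query": query}

body = build_artifact_query(artifact_type="sketch",
                            created_after="2019-03-07",
                            users=["Luise"], keyword="persona")
```

The returned dictionary would be sent as the request body of an Elasticsearch search call; building it as plain data keeps the filter logic testable without a running cluster.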

To enable users to intuitively search for artifacts they might only partially remember, we use machine learning to identify fitting tags and categories for each artifact.
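As a first approximation of such tag suggestion, before any trained model is involved, candidate tags can be scored by how often their associated terms occur in an artifact's extracted text. A minimal sketch, where the tag vocabulary is a made-up example rather than our real taxonomy:

```python
from collections import Counter
import re

# Sketch: score candidate tags against an artifact's extracted text.
# The tag vocabulary below is a made-up example; a trained classifier
# would replace this simple frequency heuristic.
TAG_TERMS = {
    "interview": {"interview", "question", "transcript"},
    "prototype": {"prototype", "mockup", "sketch"},
    "persona":   {"persona", "user", "goal"},
}

def suggest_tags(text, top_n=2):
    words = Counter(re.findall(r"[a-z]+", text.lower()))
    scores = {tag: sum(words[term] for term in terms)
              for tag, terms in TAG_TERMS.items()}
    ranked = sorted(scores, key=scores.get, reverse=True)
    return [tag for tag in ranked if scores[tag] > 0][:top_n]

tags = suggest_tags("Sketch of the first prototype shown to the user")
# → ["prototype", "persona"]
```

A learned model improves on this by generalizing beyond exact term matches, but the interface stays the same: text in, ranked tags out.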

Moreover, we will test the effectiveness of relying on the intuition of a large group of people (crowdsourcing) to come up with data categories.

Our system uses an interface that accepts voice commands and translates them into database queries, utilizing Google's voice recognition API.
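The translation step can be sketched as mapping a parsed Dialogflow webhook request onto search filters. The payload shape below follows Dialogflow's v2 webhook format (`queryResult` with `intent` and `parameters`), while the intent and parameter names (`find_artifact`, `artifact_type`, `user`) are our own hypothetical examples:

```python
# Sketch: turn a parsed Dialogflow (v2) webhook request into search
# filters for the knowledge base. The intent and parameter names
# ("find_artifact", "artifact_type", "user") are hypothetical examples.

def dialogflow_to_filters(request_json):
    result = request_json["queryResult"]
    if result["intent"]["displayName"] != "find_artifact":
        return None  # not a search request; handled by another intent
    params = result["parameters"]
    return {
        "artifact_type": params.get("artifact_type"),
        "users": params.get("user") or [],
        "keyword": result.get("queryText", ""),
    }

filters = dialogflow_to_filters({
    "queryResult": {
        "queryText": "show me the sketches from Luise",
        "intent": {"displayName": "find_artifact"},
        "parameters": {"artifact_type": "sketch", "user": ["Luise"]},
    }
})
```

Dialogflow thus handles speech recognition and intent/entity extraction, and our backend only has to map the structured result onto a database query.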

Technology

Our system employs the following technologies and methods:

  • Graph Databases (Neo4j) & Search Engines (Elasticsearch) to store and query the artifacts
  • Voice recognition and conversational interface (via the Google Dialogflow API)
  • Image / Text recognition to find relevant keywords by analyzing the text of the artifact
  • Machine learning to improve our tag suggestion
  • Crowdsourcing services (Amazon Mechanical Turk) to manually classify the artifacts into categories
  • Modern component-based UI (React)
  • End-to-end test suite (Cypress) to ensure quality and functionality of the application at any time
  • Service-oriented architecture to allow decoupled development
  • Cutting-edge continuous multi-stage deployment with Docker Swarm & Kubernetes
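To illustrate the graph-database part of this stack: an artifact and its tags can be modeled as nodes connected by relationships, so that related artifacts become reachable through shared tags. A hedged sketch that only builds the Cypher statement and parameters; the labels, properties, and relationship name are illustrative, and executing it would require a Neo4j session (e.g., via the official Python driver), which is omitted here:

```python
# Sketch: Cypher to link an artifact to its suggested tags in Neo4j.
# Labels (Artifact, Tag), properties and the relationship name
# (TAGGED_WITH) are illustrative choices, not our actual data model.
# Running the statement would require a Neo4j session, omitted here.

def artifact_tag_statement(artifact_id, tags):
    cypher = (
        "MERGE (a:Artifact {id: $id}) "
        "WITH a UNWIND $tags AS tagName "
        "MERGE (t:Tag {name: tagName}) "
        "MERGE (a)-[:TAGGED_WITH]->(t)"
    )
    return cypher, {"id": artifact_id, "tags": tags}

stmt, params = artifact_tag_statement("a-42", ["prototype", "persona"])
```

Using MERGE keeps the statement idempotent: re-tagging an artifact creates no duplicate nodes or relationships, which matters when tags are re-suggested after each classification run.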

Project Partner

HPI School of Design Thinking

  • Dr. Claudia Nicolai
  • Sherif Osman
  • Stefanie Gerken

 

Project Team

From left to right; back row: Christian Zöllner, Arne Zerndt, Leonhard Hennicke, Jascha Beste, Erik Ziegler; front row: Christian Adriano, Adrian Steppat, Luise Benkert

Students

  • Luise Benkert
  • Jascha Beste
  • Leonhard Hennicke
  • Adrian Steppat
  • Arne Zerndt
  • Erik Ziegler

Supervisors

News & Events

  • 28.08.2018   Kickoff Meeting, Start of Research Phase
  • 01.11.2018   Research Seminar
  • 05.11.2018   First Planning Meeting
  • 13.12.2018   Initial Prototype Presentation with Project Partner
  • 25.02.2019   Vision Meeting
  • 28.02.2019   Intermediate Presentation with Project Partner
  • 04.03.2019   Beta Release on askyour.cloud
  • 06.03.2019   D-School Coaches Meeting
  • 07.03.2019 - 20.03.2019 Global Design Thinking Week user testing
  • 20.03.2019   Scientific Writing Workshop
  • from 15.04.2019   D-School Advanced Track user testing

Contact

If you have questions you can visit us in A-2.3 (Mondays and Thursdays) or contact our team members or supervisors.