Hochschule Offenburg | Germany | 77xxx Offenburg | Temporary contract | Part time (flexible) / Full time | Published: 29.01.2026 on stepstone.de
Academic Assistant - Deep Learning & Embedded AI in the research project Lab2Device
Around 4,000 students from 40 nations study at Offenburg University. With four faculties, we offer a wide range of interdisciplinary and practice-oriented subjects: from business and economic psychology to mechatronics, media and medical technology, through to biotechnology and artificial intelligence. Offenburg University of Applied Sciences is a place of innovation and one of the most research-intensive universities of applied sciences in Baden-Württemberg. We maintain close contacts with companies and partner institutions in the region and are at the same time strongly internationally oriented. More than 500 people are employed at Offenburg University.
The position: Academic staff member | full time | 100 % | compensation according to E 13 TV-L | limited to 3.5 years | for the research project Lab2Device.
Your tasks
Independent work on research tasks in the Lab2Device project • independent publication of research results from the Lab2Device project • supervision of students (e.g. student assistants) • independent transfer of research results into working applications
Modern deep learning models are powerful but resource-intensive. This prevents their local use on embedded systems and leads to inefficient, insecure and privacy-critical cloud-based inference approaches. Model compression (e.g. quantization, pruning and knowledge distillation) and neural architecture search (NAS) approaches enable the local use of deep learning models, but often reduce model performance and come with further limitations (e.g. the high computational complexity, and thus high energy cost, of neural architecture search). The aim of this doctoral project is to research new model compression techniques that reduce the costs relevant for deployment on embedded systems (such as memory consumption, latency and energy consumption) with minimal loss of model performance. You will work in an interdisciplinary team (three research institutes of Offenburg University are involved: IMLA, INES, ivESK), together with another doctoral student (focus on neural architecture search) and two postdocs, to enable and benchmark resource-efficient deep learning models for local inference on embedded hardware.
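To give a flavour of the techniques named above, the following is a minimal, illustrative sketch (not part of the advertisement) of two standard model-compression steps in PyTorch: magnitude pruning followed by post-training dynamic quantization. The toy network, layer sizes and sparsity level are assumptions chosen purely for illustration; the actual models and methods researched in Lab2Device are not specified here.

```python
# Illustrative sketch only: common model-compression steps on a toy network.
import torch
import torch.nn as nn
import torch.nn.utils.prune as prune

# Small example model standing in for a network targeted for embedded inference.
model = nn.Sequential(
    nn.Linear(128, 64),
    nn.ReLU(),
    nn.Linear(64, 10),
)

# 1) Magnitude pruning: zero out the 30 % smallest-magnitude weights per layer.
for module in model.modules():
    if isinstance(module, nn.Linear):
        prune.l1_unstructured(module, name="weight", amount=0.3)
        prune.remove(module, "weight")  # make the sparsity permanent

# 2) Post-training dynamic quantization: store Linear weights as int8,
#    reducing memory footprint and often latency on CPU-class hardware.
quantized = torch.quantization.quantize_dynamic(
    model, {nn.Linear}, dtype=torch.qint8
)

# The compressed model can then be benchmarked for size, latency and accuracy
# against the full-precision original.
example_input = torch.randn(1, 128)
print(quantized(example_input).shape)  # torch.Size([1, 10])
```

Both steps trade some model accuracy for lower memory, latency and energy cost, which is exactly the trade-off the project aims to improve upon.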
Your profile
Completed university degree (master's or comparable) in computer science, statistics, electrical engineering or information technology
Experience: demonstrable programming skills (Python, C/C++) • very good knowledge of and practical experience with modern machine learning methods and frameworks (PyTorch and/or JAX) • at least basic knowledge of embedded systems • basic German skills are an advantage, good English skills are important
Personality: initiative and a structured way of working • interest in creative problem solving • enjoyment of interdisciplinary teamwork
What we offer
Attractive training opportunities that you can pursue alongside your job • a modern, well-equipped workplace • a special payment • flexible working hours • access to Corporate Benefits • the option of children's holiday care at the university daycare centre (Kita) Sommersprosse • attractive offers within occupational health management and for reconciling work, family and care responsibilities
Location
Hochschule Offenburg
77652 Offenburg
Germany