Computer Engineering Laboratory

The Computer Engineering laboratory performs research on a broad range of topics, ranging from computer arithmetic and computer architecture to compiler construction. Our work covers both embedded systems and high-performance computing, without losing sight of future and emerging technologies.

Our research focuses on five domains, each with its own challenges:

  1. Quantum Computing: the CE lab is part of the QuTech research lab, which aims to build a gate-based quantum computer. We investigate the architectural design choices that depend on the underlying qubit technology, the chosen encoding scheme, and the kind of logical qubit one wants to implement.
  2. Big Data Architectures: In the past couple of years, big data has become a well-known term referring to the exponential increase in the amount of data generated by data generation and collection systems worldwide. The analysis of these large data sets has been shown to provide valuable insights into a number of challenging problems in various fields, ranging from physics to medicine. However, carrying out such analyses continues to be hampered by the sheer size of the data sets on the one hand, and by the computational complexity of the algorithms used on the other. The CE lab performs research into HPC infrastructure to enable the efficient processing of these big data problems.
  3. Liquid Architectures: Driven by increasing System-on-Chip (SoC) complexity, in which adaptivity, reconfigurability, and composability are viewed as key system features, this line of research investigates how to make, for example, the processor architecture adaptable at runtime to the application requirements and the available hardware resources. Topics include workload characterization, hardware/software co-design, and reconfigurable VLIW processor architectures (rVEX).
  4. Dependable Nano Computing: Driven by three major challenges: 1) technology scaling (causing extreme variability, reduced reliability, …), 2) globalization of the IC supply chain (demanding a reassessment of trust in hardware, IP protection, …), and 3) the Internet of Things (demanding secure end-to-end solutions, user privacy and data protection, …), this research pillar focuses on three topics: 1) Reliability (including modelling, monitoring, mitigation, …), 2) Testability (including fault modeling and design-for-testability for 3D stacked ICs and emerging memories), and 3) Hardware security (including PUF technology, secure design, etc.).
  5. In-Memory Computing: One of the most critical challenges for today's and future data-intensive and big-data problems is data storage and analysis. The primary goal is to increase the understanding of processes by extracting the valuable insights hidden in huge volumes of data. The growth in data size has already outpaced the capabilities of today's computing architectures, which suffer from limited bandwidth, energy inefficiency, and limited scalability. Our In-Memory Computing research targets the development, design, and demonstration of a new architectural paradigm for big-data problems: it integrates storage and computation in the same physical location (using a crossbar topology) and relies on non-volatile resistive-switching technology based on memristors instead of CMOS technology.

The Computer Engineering Laboratory contributes to several master programmes, namely the Computer Engineering Master and the Embedded Systems Master. In those programmes, we are mainly responsible for the computer architecture courses, ranging from advanced multicore architectures and computer arithmetic to reconfigurable computing design.

If you want an impression of the field of Computer Engineering in general, and of what we do in Delft in particular, have a look at this video.