Master’s Thesis: Brain-inspired Recurrent Neural Algorithms for Advanced Object Recognition

It’s done! I finished my Master’s Thesis, which focuses on the concept and implementation of recurrent neural networks in computer vision, inspired by findings in neuroscience. The two main applications of this technique shown here are the recognition of partially occluded objects and the integration of context cues.

Here’s the link: Brain-inspired Recurrent Neural Algorithms for Advanced Object Recognition – Martin Schrimpf
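
To give a rough idea of what “recurrent” means here, below is a minimal, hypothetical NumPy sketch (not code from the thesis): a single layer whose response to a fixed input is refined over several recurrent timesteps, the kind of feedback that can help fill in occluded parts of an object. All sizes, weights, and step counts are arbitrary placeholders.

```python
# Minimal, illustrative sketch (not the thesis code): a recurrent layer that
# refines its representation of a fixed input over several timesteps.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_hidden, n_steps = 64, 32, 5                       # arbitrary placeholder sizes
W_in = 0.1 * rng.standard_normal((n_hidden, n_in))        # feedforward weights
W_rec = 0.1 * rng.standard_normal((n_hidden, n_hidden))   # recurrent (lateral) weights

def relu(x):
    return np.maximum(x, 0.0)

def recurrent_forward(x):
    """Unroll the recurrent layer for a fixed number of timesteps."""
    h = relu(W_in @ x)                        # initial feedforward response
    for _ in range(n_steps):
        h = relu(W_in @ x + W_rec @ h)        # recurrent refinement of the same input
    return h

x_occluded = rng.standard_normal(n_in)        # stand-in for an occluded image patch
print(recurrent_forward(x_occluded).shape)    # (32,)
```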

NIPS Brains&Bits Poster

Just presented our work on Recurrent Computations for Pattern Completion at the NIPS 2016 Brains & Bits Workshop!

Here’s the poster that I presented.

It was an awesome conference, with lots of new work and amazing people.
Here’s a really short summary, but I highly recommend going through the papers and talks:

  • unsupervised learning and GANs are hot
  • learning to learn is becoming hot
  • new threshold for deep: 1202 layers

TensorFlow seminar paper on arXiv

After some requests, I have uploaded my (really short) analysis of Google’s TensorFlow to arXiv: https://arxiv.org/abs/1611.08903.

It is really just a small seminar paper; the main finding is that while using any Machine Learning framework is generally a good idea, TensorFlow in particular has a good chance of sticking around due to its already widespread use within Google and in research, coupled with a growing community.
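
For a flavour of what using the framework looks like, here is a minimal, hypothetical example in the graph-and-session style of the TensorFlow 1.x releases discussed in the paper (later versions execute eagerly by default); it is only an illustration, not part of the seminar paper.

```python
# Minimal TensorFlow 1.x-style example: build a small dataflow graph, then
# run it in a session (the style discussed in the seminar paper).
import tensorflow as tf

x = tf.placeholder(tf.float32, shape=[None, 3], name="x")   # input fed at run time
W = tf.Variable(tf.random_normal([3, 2]), name="W")
b = tf.Variable(tf.zeros([2]), name="b")
y = tf.matmul(x, W) + b                                      # y = xW + b

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())              # initialize W and b
    print(sess.run(y, feed_dict={x: [[1.0, 2.0, 3.0]]}))     # a 1x2 result
```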

Hi there!

I’m Martin and I do research in Artificial Intelligence (including Machine Learning and Neuroscience). My focus is on Vision as well as the “basic building blocks” of intelligence, which by my definition comprise architecture and learning rules.
There are also some personal projects that I like to show here, all of which are centered around using technology to build something innovative and useful.

Scalable Database Concurrency Control using Transactional Memory

Although it’s been a while, I thought I’d upload my Bachelor’s Thesis for others to read: Scalable Database Concurrency Control using Transactional Memory.pdf.

The work consists of two parts:
Part 1 analyzes the constraints of Hardware Transactional Memory (HTM) and identifies the data structures that benefit most from this technique.
Part 2 explores different implementations of HTM in MySQL’s InnoDB storage engine and evaluates the results.
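
As a rough intuition for the execution model, here is a toy, purely illustrative Python sketch of the optimistic execute-validate-retry pattern with a lock fallback; hardware transactional memory provides this behaviour in hardware by aborting a transaction when a conflicting memory access is detected. The class and its methods are hypothetical and not taken from the thesis.

```python
# Toy software analogue of the HTM usage pattern: run optimistically,
# validate at commit time, retry on conflict, and fall back to a lock
# after repeated aborts. Purely illustrative; not code from the thesis.
import threading

class VersionedCounter:
    def __init__(self):
        self.value = 0
        self.version = 0
        self._lock = threading.Lock()          # fallback path, as real HTM code also needs

    def increment(self, retries=3):
        for _ in range(retries):
            # "Transaction" start: read a snapshot without holding the lock.
            seen_version, new_value = self.version, self.value + 1
            # Commit: succeed only if nobody changed the state in the meantime.
            with self._lock:
                if self.version == seen_version:
                    self.value = new_value
                    self.version += 1
                    return
            # Conflict detected ("abort"): retry optimistically.
        with self._lock:                       # fallback after repeated aborts
            self.value += 1
            self.version += 1
```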