Student Thesis Talks – Part I

Parker Finch – Static Coalescing of Race Checks
Much of our computing infrastructure uses multicore processors and multiprocessor hardware. Such systems can concurrently execute many software threads of control to improve responsiveness and performance, but the potential for unintentional interference between concurrent threads makes it difficult to ensure the reliability of multithreaded software. Automated tools to identify concurrency errors, such as race conditions, have the potential to make this task easier.

A race condition occurs when two threads access the same piece of data at the same time and at least one of them modifies it. While dynamic data race detection algorithms have improved in recent years, the overhead of race detection in array-intensive programs remains prohibitive. One promising insight is that arrays are often accessed via common patterns that enable compression of the checks performed and the information maintained by a dynamic race detector. However, a purely dynamic implementation of this compression technique failed to realize a corresponding decrease in run time because of the cost of inferring those access patterns at run time. We explore statically annotating programs with the patterns that arise at run time, eliminating the need to infer them dynamically. Preliminary results show a significant speedup in array-intensive programs.
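To see why coalescing checks helps, consider a loop that writes every element of an array: a conventional detector records one check per element, but if the loop is known (here, via a static annotation) to sweep a contiguous index range, a single summarized check can stand in for all of them. The sketch below is a hypothetical illustration, not the thesis implementation; the RangeCheck class and its fields are invented for exposition, and happens-before tracking is omitted for brevity.

    # Per-element checking: a detector records n separate checks for
    #
    #     for i in range(n):
    #         detector.check_write(a, i, tid)   # one check per element
    #         a[i] = 0
    #
    # If a static annotation says the loop writes the contiguous range
    # [0, n), the detector can record one coalesced check instead.

    class RangeCheck:
        """One metadata record standing in for n per-element checks."""

        def __init__(self, array_id, lo, hi, is_write, thread_id):
            self.array_id = array_id
            self.lo, self.hi = lo, hi      # half-open index range [lo, hi)
            self.is_write = is_write
            self.thread_id = thread_id

        def conflicts_with(self, other):
            # Two coalesced accesses may race if they touch the same
            # array, their index ranges overlap, at least one is a
            # write, and they come from different threads. (A real
            # detector would also consult happens-before information.)
            return (self.array_id == other.array_id
                    and self.lo < other.hi and other.lo < self.hi
                    and (self.is_write or other.is_write)
                    and self.thread_id != other.thread_id)

Comparing two RangeCheck records costs the same no matter how many elements they cover, which is where the potential speedup over per-element checking comes from.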

Joshua Geller – Black Box or Composite: Approaching Multi-Class Feature Selection
In classifier learning, some multi-class classification algorithms, such as support vector machines (SVMs), make their predictions by combining a series of binary classifiers. Feature selection identifies and removes the least useful features, seeking an “optimal” subset that makes the classification task more efficient or more effective. There are three major ways of thinking about feature selection during classifier learning. One of these, the wrapper method, is a general class of algorithms that can be applied to any underlying classifier learner, although the choice of learner influences which features are selected. An interesting issue arises when the underlying classifier learner is fundamentally designed for two-class problems, as SVMs are. Current approaches operate only at the “black-box” level, using one uniform feature set for every underlying binary classifier. I propose instead a “composite” approach that performs feature selection for each of the underlying binary classifiers individually. I furthermore investigate a number of other questions pertaining to feature selection for multi-class problems.
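To make the black-box versus composite distinction concrete, here is a minimal sketch using scikit-learn, assuming a one-vs-rest linear SVM. For brevity it uses a simple univariate filter score (f_classif) in place of a full wrapper search, and all variable names are illustrative rather than taken from the thesis.

    from sklearn.datasets import load_iris
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.multiclass import OneVsRestClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import LinearSVC

    X, y = load_iris(return_X_y=True)   # 3 classes, 4 features

    # "Black box": pick one uniform feature subset for the whole
    # multi-class problem, then hand that same subset to every
    # underlying binary (one-vs-rest) SVM.
    black_box = make_pipeline(
        SelectKBest(f_classif, k=2),
        OneVsRestClassifier(LinearSVC()),
    )

    # "Composite": move selection inside the decomposition, so each
    # one-vs-rest subproblem selects the features most useful for
    # separating its own class from the rest.
    composite = OneVsRestClassifier(
        make_pipeline(SelectKBest(f_classif, k=2), LinearSVC()),
    )

    black_box.fit(X, y)
    composite.fit(X, y)

The only structural difference is where selection sits relative to the one-vs-rest decomposition: outside it (one shared feature set) or inside it (a potentially different feature set for each binary classifier).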


Parker Finch ’14

Joshua Geller ’14