About Page

Hi there! I'm Jay. I'm 23 years old and just graduated from Indiana University.

I have a Bachelor of Science in Intelligent Systems Engineering, with a concentration in Computer Engineering.

I have been programming with C & Python for 4+ years, and I have extensive experience with MATLAB and microcontrollers.

I have a solid understanding of widely used libraries and frameworks like NumPy, pandas, and scikit-learn.

Using pandas, I perform data manipulation, exploration, and feature engineering, while scikit-learn empowers me to build and train robust machine learning models.

I excel in implementing various machine learning algorithms, including linear regression, decision trees, random forests, and support vector machines.
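As a sketch of that kind of workflow, here is a minimal, self-contained comparison of those model families in scikit-learn. The dataset is synthetic and the settings are illustrative only; logistic regression stands in for the linear-model family since this toy task is classification.

```python
# Illustrative sketch: fitting and scoring several scikit-learn models
# on a synthetic dataset. Hyperparameters and data are placeholders.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC

# Toy classification problem with a reproducible split.
X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

models = {
    "logistic regression": LogisticRegression(max_iter=1000),
    "decision tree": DecisionTreeClassifier(random_state=0),
    "random forest": RandomForestClassifier(random_state=0),
    "SVM": SVC(),
}

scores = {}
for name, model in models.items():
    model.fit(X_train, y_train)          # train on the training split
    scores[name] = model.score(X_test, y_test)  # accuracy on held-out data
    print(f"{name}: {scores[name]:.2f}")
```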

My last internship, with Reckitt, provided me with professional data analytics and software engineering experience using company-wide product lifecycle management software.

Additionally, I have a background in other programming languages, embedded systems, and robotics/drones.


Capstone Team 7

One of the more significant projects I worked on as an engineer was my senior design capstone project.

Our prototype was called the EMG-controlled exoskeletal arm brace and was sponsored by VispalExo.

An Arduino microcontroller controls the logic of the brace using a support vector machine (SVM) algorithm.

The SVM classifier reads the EMG signal and determines whether the user is flexing or extending the attached muscle.
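To illustrate the idea, here is a hypothetical Python sketch of that flex/extend classification: an SVM trained on simple time-domain features of windowed EMG samples. The feature choices (mean absolute value and RMS) and the synthetic signals are my own illustrative stand-ins, not the actual capstone implementation, which ran on the Arduino.

```python
# Hypothetical sketch of EMG flex/extend classification with an SVM.
# Synthetic signals stand in for real EMG windows: "flex" windows are
# modeled with higher amplitude than "extend" windows.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def emg_features(window):
    """Mean absolute value and root-mean-square of one EMG window."""
    return [np.mean(np.abs(window)), np.sqrt(np.mean(window ** 2))]

# 50 synthetic windows per class, 200 samples each.
flex = [rng.normal(0, 1.0, 200) for _ in range(50)]
extend = [rng.normal(0, 0.3, 200) for _ in range(50)]
X = np.array([emg_features(w) for w in flex + extend])
y = np.array([1] * 50 + [0] * 50)  # 1 = flex, 0 = extend

clf = SVC(kernel="linear").fit(X, y)

# Classify a new high-amplitude window.
new_window = rng.normal(0, 1.0, 200)
label = clf.predict([emg_features(new_window)])[0]
print("flex" if label == 1 else "extend")
```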

VispalExo Website
Picture of the brace hardware; I personally designed and assembled this prototype

Hardware assembly of the exoskeleton brace

First demo of the exoskeleton arm from the first semester

Working implementation of the EMG-controlled brace (EMG sensor on left arm)

Undergraduate Research

Trusted AI Symposium Photo
Photo from the Trusted AI Symposium at the Indiana University Center for Artificial Intelligence

During my time in the lab, I primarily worked on two studies. The first was called Trusted AI, or Trust in AI (depending on who you ask). This project is funded by NSA Crane and is still active in the socio-physio lab.

This study involved one outside subject and one lab member. The subject and the lab member were instructed either to play cooperatively to achieve a high score or to play against each other to achieve a high score. The cool part about this game is that if you don't work together, you will almost always lose more money than you would working in tandem.

The game consisted of two haptic controllers in separate rooms. Each haptic device controlled its own agent, but the two agents were tethered together. The two players were also provided with a camera to visually communicate with the other player, but they were physically prevented from speaking to each other verbally. Once the game started, the participants were shown the two agents and the tether connecting them. When one player moved their haptic controller up, the other player would feel the pull from their counterpart on their hand.

To make things even spicier, the gates shown to each player were positioned similarly to Flappy Bird (except there were two gates you could pass through). One gate carried a net gain of money while the other carried a net loss. Failing to pass through either gate would have a drastic impact on your money. So players were forced to play knowing that the "opponent" might seize a chance to gain more money or might choose to cooperate and potentially lose money. However, if each player pulled toward the opposing gate, they would both lose a ton of money (or all of it).
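The incentive structure above can be sketched as a simple payoff function. The dollar amounts below are made-up placeholders; only the structure (gain gate, loss gate, missing both, pulling apart) follows the description of the game.

```python
# Illustrative payoff logic for one round of the "hapty-bird" game.
# All dollar values are hypothetical placeholders.
def round_payoff(outcome):
    payoffs = {
        "gain_gate": +1.00,     # both tethered agents pass the gain gate
        "loss_gate": -0.50,     # both agents pass the loss gate
        "missed": -2.00,        # the agents miss both gates
        "pulled_apart": -3.00,  # players pull toward opposite gates
    }
    return payoffs[outcome]

# Cooperating on the gain gate beats fighting over the gates.
print(round_payoff("gain_gate") > round_payoff("pulled_apart"))
```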

This study attempted to evaluate the trust between people and an autonomous agent. The study had some interesting findings; one subject even got so mad they got up and left the lab!

During the spring semester, we presented some of the findings to the Department of Defense. We were able to show them the protocol and some of the visitors even played our game.

My main contributions to this project were working on the haptic controller code, fixing bugs in the "hapty-bird" game, running subjects through the protocol, and gathering and visualizing the collected data to analyze results and findings.

Trusted AI is sponsored by NSA Crane and is still currently accepting subjects. Please see these articles to learn more about Trusted AI.

  • Trusted AI Article 1
  • Trusted AI Article 2

Autism Movement Study

During my research journey, I dedicated a significant amount of time to the Autism Movement study in collaboration with researchers from IU Kokomo.

This study, conducted at the Socio Neural lab, focuses on understanding the movements and interpretations of individuals across the autism spectrum.

Within this project, we developed two main protocols aimed at investigating the emotional movements of participants.

The first protocol involved using a Kinect camera to track the movements of participants, capturing point-line data.

Participants were instructed to perform a series of "emotional" movements, which were classified by raters to determine the intensity of the emotion expressed.

By analyzing the point-line data using Python along with the SciPy library, we assessed the severity of movement in comparison to baseline actors, enabling us to gain insights into how individuals with autism spectrum disorder differ in their motor behavior.
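As a hypothetical sketch of that trajectory comparison: quantify how much a tracked joint moves over a recording and compare it against a baseline actor. The "severity" metric here (smoothed path length) and the synthetic trajectories are my own illustrative choices, not the study's actual pipeline.

```python
# Hypothetical sketch: compare a participant's joint trajectory against
# a baseline actor using total path length as a movement measure.
import numpy as np
from scipy.ndimage import uniform_filter1d

def path_length(xy):
    """Total distance traveled by one joint across all frames.
    xy is an (n_frames, 2) array of tracked point coordinates."""
    smoothed = uniform_filter1d(xy, size=5, axis=0)  # suppress frame jitter
    steps = np.diff(smoothed, axis=0)                # per-frame displacement
    return float(np.sum(np.linalg.norm(steps, axis=1)))

# Synthetic trajectories: the "participant" traces three loops in the
# same time the "baseline" traces one, i.e. roughly 3x the movement.
t = np.linspace(0, 2 * np.pi, 200)
baseline = np.column_stack([np.cos(t), np.sin(t)])
participant = np.column_stack([np.cos(3 * t), np.sin(3 * t)])

ratio = path_length(participant) / path_length(baseline)
print(f"movement relative to baseline: {ratio:.1f}x")
```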

With the help of Python's image processing capabilities and the SciPy library, we processed the recorded video footage and extracted the coordinates of the body points for each frame. This involved using computer vision algorithms to detect and track specific body joints or markers.

Once the point-line data was extracted and processed, we transformed it into a suitable format using the pandas library and exported it to Excel for further analysis and visualization.
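A minimal sketch of that export step: the per-frame joint coordinates go into a tidy DataFrame, then out to an Excel workbook. The column names and file name below are illustrative placeholders.

```python
# Illustrative sketch: per-frame joint coordinates -> DataFrame -> Excel.
import numpy as np
import pandas as pd

n_frames = 100
frames = pd.DataFrame({
    "frame": np.arange(n_frames),          # frame index
    "wrist_x": np.random.rand(n_frames),   # placeholder coordinates
    "wrist_y": np.random.rand(n_frames),
    "elbow_x": np.random.rand(n_frames),
    "elbow_y": np.random.rand(n_frames),
})

# DataFrame.to_excel needs an engine such as openpyxl installed.
try:
    frames.to_excel("point_line_data.xlsx", index=False)
except ModuleNotFoundError:
    pass  # no Excel engine available in a minimal environment
```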

The second protocol involved participants sitting in front of a computer screen and following instructions provided in a video. They were shown two videos: the first presented instructions, guiding the subjects to mirror the emotions displayed on screen as closely as possible. The second consisted of a collection of baseline emotional responses.

In this stage of the study, I played a key role in developing the protocol for the mirroring-faces exercise. Additionally, I was responsible for processing the collected data from OpenFace and loading it into MATLAB for further analysis.

Using data processing tools such as OpenFace and OpenCV, we were able to analyze facial affect data and identify significant differences between individuals with autism spectrum disorder and neurotypical subjects. By employing data analytics techniques, we mass-processed the data and extracted relevant features, including the facial expressions that differed the most from the baseline.
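As a hypothetical sketch of that comparison in Python (the study's actual analysis ran in MATLAB): load action-unit intensities of the kind OpenFace exports and test which units differ most between groups. The column names, group data, and the use of Welch's t-test are illustrative assumptions, not the study's code or findings.

```python
# Hypothetical sketch: find which facial action units (AUs) differ most
# between two groups. The AU column names follow OpenFace's "AU##_r"
# intensity convention; the data here is synthetic.
import pandas as pd
from numpy.random import default_rng
from scipy import stats

rng = default_rng(0)
au_cols = ["AU06_r", "AU12_r", "AU25_r"]  # e.g. cheek raiser, lip corner puller

# Stand-ins for per-frame AU intensities from two groups of recordings.
group_a = pd.DataFrame(rng.normal(2.0, 0.5, (300, 3)), columns=au_cols)
group_b = pd.DataFrame(rng.normal(1.2, 0.5, (300, 3)), columns=au_cols)

# Welch's t-test per action unit: smaller p = larger group difference.
results = {
    au: stats.ttest_ind(group_a[au], group_b[au], equal_var=False).pvalue
    for au in au_cols
}
for au, p in sorted(results.items(), key=lambda kv: kv[1]):
    print(f"{au}: p = {p:.2g}")
```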

This research aims to quantify movement patterns as "autistic" or "neurotypical." It is important to note that the study is still in development, and the findings are yet to be published. Currently, our lab is actively seeking grant funding to further support this research in the field of mental health.

  • Autism Movement Study article
  • Engineering projects