A Software Engineer passionate about designing and developing interactive applications. With the help of tools such as Unity and Unreal Engine, I build entertaining products and prototypes. I am currently leading the development of interactive products at a mobile application development company.
I enjoy playing soccer and video games in my free time. I also listen to a wide range of music genres. Inspired by the music I listen to, I play around with audio production software and compose tunes. As a hobby, I collect musical instruments. Here you can learn about the projects I have worked on in the recent past.
The Pittsburgh Steelers officially rolled out a new Augmented Reality (AR) experience at Heinz Field for home games and stadium tours. The official Steelers app lets fans interact with the Terrible Towel Wall in the stadium. Fans can view a video on the creation of the iconic towel or swipe through a gallery of fan favorites and their history.
The main objective of this project was to create a feature that would elevate the experience of fans attending football games at Heinz Field, Pittsburgh. To bring this experience to life, we researched the stadium and its surroundings. This soon led us to explore the most popular spots fans visit in the stadium, and the Terrible Towel wall was at the top of our list. We then began working on prototypes that would connect well with this physical space. Eventually, we settled on creating a virtual Terrible Towel experience that would allow fans to swipe through all the different types of towels and read more about them.
In addition to the virtual towel experience, we also built a feature that allowed visiting fans to receive a custom Steelers Nation Unite celebration sticker. Each sticker was based on the fan's game-day “celebration”, like a birthday or first game, and enabled various filters when photographed in a selfie. Fans could then take the selfie and share it over social media.
The Wolverhampton Wanderers football club was unveiling its new jersey design and kit at the Molineux stadium. The team was looking for ways to engage guests at the venue through immersive technology combined with social media marketing. The team sent us the materials for a life-size poster of their players in the new kits that would be unveiled at the stadium. We proposed the idea of an AR experience that would allow people to scan the poster, see the players walk out of it, and pose for a photograph with them. The experience was located by the souvenir shop, where the new kits were on sale for all the fans to see.
The experience involved focusing a mobile camera on a life-sized poster of three football players, at which point the players would step out of the poster into AR space and interact with viewers. The aim of the experience was to engage the audience at the venue and use emerging technology to foster social media sharing and brand awareness.
We collaborated with the Wolves prior to the poster shoot to ensure we would be working with footage of high precision and quality. We went over best practices for capturing reliable green screen footage that would be ready for keying on the fly and appropriately scaled for the space. This implementation of green screen video in AR was a more nuanced version of Yinzcam's previous products, which involved handheld pop-up videos or life-size billboard videos.
The footage was carefully matched with each player's pose on the poster and overlaid to test for accuracy. We had to adhere to space constraints since the feature would be housed in the official Wolves mobile app, so the videos had to be carefully compressed without compromising quality. We also decided to forgo HAP video with alpha channels and instead opted for a custom Unity shader that would key out the green background in real time. For this purpose, the videos were keyed manually and then re-exported with a flat, bright green background that was optimal for live keying.
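Real-time keying of this kind boils down to per-pixel math that a fragment shader evaluates every frame. Here is a minimal sketch of that logic in Python; the threshold and softness constants are illustrative placeholders, not the values used in the shipped shader.

```python
def key_alpha(r, g, b, threshold=0.4, softness=0.1):
    """Compute an alpha value for one RGB pixel (components in 0..1).

    Pixels whose green channel strongly dominates red and blue are made
    transparent; a softness band ramps alpha to avoid hard fringes.
    The constants here are illustrative, not tuned shader values.
    """
    greenness = g - max(r, b)           # how "green screen" the pixel looks
    if greenness > threshold:
        return 0.0                      # background: fully keyed out
    if greenness < threshold - softness:
        return 1.0                      # foreground: fully opaque
    return (threshold - greenness) / softness   # soft edge in between

# A flat bright-green pixel is removed; a skin-tone pixel is kept.
print(key_alpha(0.0, 1.0, 0.0))   # 0.0
print(key_alpha(0.8, 0.6, 0.5))   # 1.0
```

A flat, uniformly bright background makes the `greenness` term cleanly separate foreground from background, which is why the manually keyed videos were re-exported over a single solid green.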
Launched during the 2017 NBA playoffs, Deep in the Q is a virtual basketball experience developed for the Cleveland Cavaliers. The app combines the classic “pop-a-shot” basketball game with modern augmented reality technology. With the flick of a finger, fans with the app can show off their ball-shooting skills and compete against other players in or outside the Quicken Loans Arena.
Development of the app kicked off with simple sketches of how the game would look in terms of perspective, such as the placement of the ball and backboard. We also kept in mind that the whole experience had to be simple and straightforward for users, since it was being built in a virtual space.
The key interaction involved pointing the phone at an image (over which the virtual backboard appeared) and shooting baskets. I began by implementing the point-and-shoot mechanism, as that was the essence of the game. After a good amount of playtesting and tweaking, I arrived at a solution that I believed was both challenging and fun. To increase the game's replayability and introduce a sense of competition among users, leaderboards and achievements were added.
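One common way to turn such a flick gesture into a shot is to derive launch power from swipe speed and aim from the swipe's horizontal drift. The sketch below illustrates that idea; the scaling constants and function names are hypothetical, since the shipped tuning came out of playtesting.

```python
import math

def flick_to_shot(dx, dy, dt, power_scale=0.0005, max_power=1.0):
    """Map a flick gesture to a shot (illustrative constants).

    dx, dy: screen-space swipe delta in pixels (dy positive = upward flick)
    dt:     duration of the swipe in seconds

    Returns (power, aim): a normalised launch power in 0..1 and a
    left/right aim angle in radians.
    """
    speed = math.hypot(dx, dy) / max(dt, 1e-6)   # pixels per second
    power = min(max_power, speed * power_scale)  # faster flick -> harder shot
    aim = math.atan2(dx, dy)                     # deviation from straight ahead
    return power, aim

# A quick, straight upward flick shoots at full power, dead ahead.
power, aim = flick_to_shot(0, 500, 0.25)
```

Clamping the power keeps absurdly fast flicks from overshooting the court, which is one of the knobs playtesting tends to settle.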
To make the game more entertaining for in-arena fans, social zones were set up that allowed users to compete in one-on-one basketball shootouts. The app also let fans compete as a group: fans from one section of the stadium could play against opposing sections while pointing their phones at the graphic displayed on the main scoreboard.
Only a privileged few get to raise the official 12th Man flag at Seattle’s CenturyLink Field and hear the roaring crowd chant SEA-HAWKS. Within the Seahawks official app, fans everywhere could now virtually get in on the ritual of raising football fandom’s most recognizable flag.
Our initial approach was to learn more about the flag-raising ceremony and what it meant for Seahawks fans. How could fans at home, watching the ceremony on television, experience the same excitement as those in the stadium? The idea was to bring the stadium experience to the user. The first prototype let users point at a 12 flag logo to make the pole and flag appear; the flag would then rise automatically.
After testing the app, I noticed that this version did not let users actively participate in raising the flag, so I added a few interactions and tweaks. Now, with a simple swipe, the vertical movement of the flag could be controlled. I also added a feature where the background volume of the SEA-HAWKS chant would gradually increase as the flag moved up the pole. Once the flag reached the top, a share button would appear, allowing users to snap a picture of the raised flag and share it on any social media website.
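The swipe-to-raise interaction can be sketched as a small piece of state; the pole height and swipe units below are illustrative, not values from the app.

```python
class FlagRaiser:
    """Sketch of the flag-raising interaction (illustrative constants)."""

    def __init__(self, pole_height=10.0):
        self.pole_height = pole_height
        self.flag_height = 0.0

    def swipe(self, delta):
        # A swipe moves the flag up or down, clamped to the pole.
        self.flag_height = max(0.0, min(self.pole_height,
                                        self.flag_height + delta))

    @property
    def chant_volume(self):
        # The SEA-HAWKS chant gets louder as the flag climbs (0..1).
        return self.flag_height / self.pole_height

    @property
    def share_enabled(self):
        # The share button appears once the flag reaches the top.
        return self.flag_height >= self.pole_height

flag = FlagRaiser()
flag.swipe(5.0)     # halfway up: chant at half volume
flag.swipe(100.0)   # overshoot is clamped; share unlocks at the top
```

Tying chant volume directly to flag height is what makes the interaction feel continuous rather than a one-shot animation.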
The Boston Celtics launched an augmented reality shot chart feature for their fans during the playoffs of the 2017-2018 NBA season. The feature allowed in-arena fans to scan a QR code printed on the game program and visualize a particular game's shot chart in 3D space.
The shot chart in the Celtics official app displays all the data in 2D. Presenting it to fans in a new way took a fair amount of research, which led to a feature that displayed all the shot chart data over a virtual court, with each shot represented as a tiny interactable basketball. Tapping on a basketball would open the player info and other details specific to that game. Each game's shot data was tied to a distinct QR code, so users would ideally collect these codes in order to review the stats for games they had attended earlier. In addition to the stats, the feature also provided an option to check a game's highlights.
Let’s face it, painting your face to support your favorite team can be a pain. This one-of-its-kind fan experience lets fans overlay virtual face paint, helmets, and other 3D models on a live image of themselves. Fans can then take selfies and share the photos on social media to show support for their favorite teams.
To make this experience successful, the project was built on top of a face detection library. The points generated after detecting the user's face were used to generate a mesh over which an image was applied as a texture. Development was done keeping in mind that the masks/virtual paint filters should be easy to modify without an app update, so all the image assets were designed and programmed to load from servers. This gave us the flexibility to deploy the feature for multiple teams with ease.
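As a simplified 2D illustration of how detected landmarks can drive the placement of a paint texture (the real feature built a full mesh from the detected points; this sketch only derives scale, roll, and position from two eye landmarks, with an assumed texture unit):

```python
import math

def mask_placement(left_eye, right_eye, mask_eye_span=0.5):
    """Derive how to draw a face-paint texture from two eye landmarks.

    left_eye, right_eye: (x, y) screen coordinates from a face detector
    mask_eye_span:       distance between the eyes in the texture's own
                         units (an illustrative value)

    Returns (scale, angle, centre): how much to scale the texture, the
    head-roll angle in radians, and where to centre it on screen.
    """
    dx = right_eye[0] - left_eye[0]
    dy = right_eye[1] - left_eye[1]
    span = math.hypot(dx, dy)        # on-screen distance between the eyes
    scale = span / mask_eye_span     # bigger face -> bigger mask
    angle = math.atan2(dy, dx)       # head roll
    centre = ((left_eye[0] + right_eye[0]) / 2,
              (left_eye[1] + right_eye[1]) / 2)
    return scale, angle, centre

# Level eyes 100 px apart: mask scaled 200x, no roll, centred between them.
scale, angle, centre = mask_placement((100, 200), (200, 200))
```

A full mesh approach extends the same idea to every landmark, so the texture deforms with the face instead of merely scaling and rotating.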
AFL clubs such as Hawthorn and the Sydney Swans utilized AR to unlock special content for their members.
Development of the feature was straightforward, as we wanted to keep everything simple. The AR content for the Hawks was a green screen video file that had all the effects baked in; for the Sydney Swans, we stuck to our regular video player. Both features were developed in a very short time frame.
This feature was developed for both Android and iOS platforms and was integrated into the Hawks official app.
The NFL's Carolina Panthers wanted to utilize AR to unlock player cards for visitors to their draft party event. Users could scan and unlock cards that displayed a particular player's bio.
With a small time frame to develop the feature, the user flow and design were kept simple. By scanning player cards, which were printed and placed around the venue, users could view a player's name, description, and a short video clip. A view was also designed where users could see their collection of unlocked cards.
This feature was developed for both Android and iOS platforms and was integrated into the Panthers official app, although it was only made available (geofenced) to fans attending the event.
Launched during an Eagles 5K run event, this AR experience allowed fans to unlock special content by scanning their medals.
Designed and developed over a span of two weeks, the feature was made available to fans attending the 5K run. Scanning the medal would unlock special content in the form of a video message from the team owner.
This feature was developed for both Android and iOS platforms and was integrated into the Eagles official app.
Wolverhampton Wanderers supporters could win weekly prizes by topping the leaderboard of an AR football flick game made available at the Mander Centre, UK.
The game was built for fans visiting the Mander Centre, where they could play a fun game of soccer by pointing their phones at a wall advertisement. Fans could top the leaderboard by scoring the most points over a fixed time period.
This feature was developed for both Android and iOS platforms and was integrated into the Wolves official app.
During the 2017 NFL season, the Kansas City Chiefs launched the "Coca-Cola Refreshing Moments in Chiefs History" AR feature, a campaign that invited fans to purchase a special edition Chiefs-branded Coca-Cola can. The can could be scanned by the built-in camera within the team's official mobile app, triggering an AR video board to appear above the can. Fans could then choose from a selection of historic plays on this video board and control the virtual videos by turning the physical can: clockwise to rewind, counterclockwise to fast-forward.
The Coke can acted as the starting point (trigger) for the experience, and everything else had to be built around it. Development started with exploring different ways to make the experience engaging; we then decided to use the can as a tool for interacting with the virtual world. The can in the real world acted as a video player controller in the virtual world.
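The rotation-to-scrubbing mapping can be sketched like this; the seconds-per-degree constant and the sign convention are assumptions for illustration, not the tuned values from the app.

```python
class CanScrubber:
    """Map the physical can's rotation to a video seek position.

    Counter-clockwise rotation (positive degrees) fast-forwards,
    clockwise (negative) rewinds. The constant is illustrative.
    """
    SECONDS_PER_DEGREE = 0.05

    def __init__(self, duration):
        self.duration = duration      # video length in seconds
        self.position = 0.0           # current playback position

    def rotate(self, degrees):
        # Accumulate the can's rotation into a seek, clamped to the
        # video's bounds so scrubbing never runs off either end.
        self.position = max(0.0, min(
            self.duration,
            self.position + degrees * self.SECONDS_PER_DEGREE))
        return self.position

scrubber = CanScrubber(duration=60.0)
scrubber.rotate(100)    # counter-clockwise: jump forward 5 seconds
scrubber.rotate(-300)   # clockwise past the start clamps to 0
```

In the app, the tracker's reported orientation of the can image target would supply the per-frame rotation delta fed into such a mapping.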
The video was enclosed inside an oval-shaped 3D model resembling the video board from the KC Arrowhead Stadium. This feature was developed for both Android and iOS platforms and was integrated into the Chiefs official app.
This feature allows Milwaukee Bucks fans to actively participate in the REV UP moment through the Bucks official team app. The Harley-Davidson-sponsored feature is designed to excite fans as a Harley-Davidson motorcycle is revved up and ridden inside Milwaukee's Bradley Center.
The feature was straightforward and was built within a short period. It involved a simple interaction: tapping the Rev Up button (as shown in the image) would trigger the sound of a Harley-Davidson motorcycle revving up. When multiple fans pressed the button together, it added to the excitement.
An augmented reality basketball game launched for the Raptors during the 2017-2018 NBA season. The app combined augmented reality technology with the classic “pop-a-shot” basketball game: with the flick of a finger, fans could show off their basketball shooting skills by competing against other players in or outside the Scotiabank Arena.
The experience was similar to the Cavs' Deep in the Q, with a couple of modifications. A timer was added to increase the competitiveness, and the scoring system was modified. A system to maintain an internal leaderboard, alongside the Google Play and iOS Game Center leaderboards, was also included.
The AFL Richmond FC wanted to give its fans the experience of raising the 2017 premiership flag. An augmented reality experience was created and included in the official AFL Richmond FC app.
This experience was built from the existing Seattle Seahawks project, with minor changes to match the team's theme and colors.
In the 2016 NBA season opener, players, coaches, and staff members received championship rings in the main ceremony. Courtesy of Nike, fans attending the game received silicone bands from the Cavaliers. The bands activated an augmented reality experience through the team's official app that simulated the look of wearing the championship ring.
The experience would augment the championship ring around the user's finger when they pointed their phone at it. To build this, we came up with an idea, along with the official sponsor Nike, to create finger bands to be distributed among the fans in the stadium. After a couple of iterations, we arrived at a simple pattern that would be easily detectable by the camera and at the same time fit well on a tiny finger band.
The 3D model of the championship ring that I received was very detailed and had to be cut down to a low-poly model for it to run well on mobile devices. After gathering all the assets, it was time to work on the user experience flow, which had the following interactions:
Our team "Akili", Swahili for "smart", developed a tablet trivia game for high school students in the US and Tanzania belonging to the Opportunity Education Foundation. The main focus of the game was to develop the children's self-assessment and life skills.
Akili conducted surveys and playtests with target audiences to design and develop an experience that could be enjoyed globally. We introduced the concept of identities, each relating to a future career a student might want to pursue. As a game designer and programmer, I implemented the gameplay mechanics for the single-player trivia mode and the multiplayer split-screen trivia mode, and I also programmed the backend integration for handling the quiz data.
Built an interactive installation along with teammates for the middle school students of the Elizabeth Forward School District and the Bethlehem Centre. This interactive game was used to help students understand, in a creative way, the science behind exploration for fossil fuels. With a strong focus on geology and seismic mapping, our team created an asymmetric game in which students face the challenge of mapping the earth beneath its surface and locating fossil fuels. Students have access to different sets of tools and information and must work together to solve their layer-mapping puzzle.
Built several games in time frames of 2-3 weeks each, with a focus on creating exciting user experiences with the available technology. As a gameplay programmer and game/sound designer, I created games for several platforms, including Windows and PlayStation, using tools like Leap Motion, Oculus VR, and Kinect.
Rigged an electronic circuit to implement a sonar-based collision avoidance system and created a MATLAB simulation of a vision-based collision avoidance system. The sole purpose of this experiment was to evaluate the feasibility of autonomous flight for a MAV. An 8-bit PIC microcontroller, programmed in Embedded C, drove the circuit.