I am a software engineer with experience in developing interactive applications, specifically in the field of AR/VR and game development. Using tools such as Unity3D and Unreal Engine, I build interesting products and prototypes. I am currently leading the design and development of augmented reality products for a mobile application development company.
I play soccer and video games in my free time, and I listen to a wide range of music genres. Inspired by the music I listen to, I play around with audio production software and compose tunes. As a hobby, I collect musical instruments. Here you can learn about the projects I have worked on in the recent past.
Launched during the 2017 NBA playoffs, Deep in the Q is a virtual basketball experience developed for the Cleveland Cavaliers. The app combines the classic “pop-a-shot” basketball game with modern augmented reality technology. With the flick of a finger, fans with the app can show off their shooting skills and compete against other players in or outside the Quicken Loans Arena.
Development of the app kicked off with simple sketches of how the game would look in terms of perspective, such as the placement of the ball and backboard. Since the whole experience was being built in a virtual space, it also had to stay simple and straightforward for users.
The key interaction involved pointing the phone at an image (over which the virtual backboard would be overlaid) and shooting baskets. I began by implementing the point-and-shoot mechanism, as that was the essence of the game. After a good amount of playtesting and tweaking, I arrived at a solution that I believed was both challenging and fun. To increase the game's replayability and introduce a sense of competition among users, we decided to add leaderboards and achievements.
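At its core, the flick-to-shoot mechanic maps a swipe gesture into a launch velocity for the ball. The sketch below illustrates the idea in Python; the actual app was built in Unity, and the function name and tuning constants here are hypothetical, not the real implementation.

```python
# Illustrative sketch: convert a touch swipe into a launch velocity for a
# "pop-a-shot" style AR basketball game. All gains are made-up tuning
# constants; a real Unity implementation would apply the result as a
# force on the ball's rigidbody.

def flick_to_velocity(start, end, duration_s,
                      lateral_gain=2.0, forward_gain=1.5, lift_gain=1.2):
    """Map a screen-space swipe (normalized 0..1 coords) to a 3D launch velocity.

    start, end: (x, y) touch positions; duration_s: swipe time in seconds.
    Returns (vx, vy, vz): sideways, upward and forward speed.
    """
    dx = end[0] - start[0]               # sideways component of the swipe
    dy = end[1] - start[1]               # upward component of the swipe
    speed = dy / max(duration_s, 1e-3)   # faster swipes throw harder
    vx = lateral_gain * dx               # aim left/right
    vy = lift_gain * dy                  # arc height
    vz = forward_gain * speed            # distance toward the hoop
    return (vx, vy, vz)
```

Tying the forward speed to swipe duration is what makes the mechanic skill-based: a short, fast flick overshoots while a slow drag falls short, which is the kind of balance playtesting has to tune.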
To make the game more fun for in-arena fans, social zones were set up that allowed users to compete in 1-vs-1 basketball shootouts. The app also let fans compete as a group: fans from one section of the stadium could play against the opposite section while pointing their phones at the graphic displayed on the main scoreboard.
Only a privileged few get to raise the official 12th Man flag at Seattle's CenturyLink Field and hear the roaring crowd chant SEA-HAWKS. Within the Seahawks' official app, fans everywhere could now virtually take part in the ritual of raising football fandom's most recognizable flag.
Our initial approach was to learn more about the flag-raising ceremony and what it meant to Seahawks fans. How could fans at home, watching the ceremony on television, experience the same excitement as the ones in the stadium? The idea was to bring the stadium experience to the user. The first prototype let users point at a 12 Flag logo to make the pole and flag appear; the flag would then rise automatically.
After testing the app, I noticed that this version did not let users actively participate in raising the flag, so I decided to add a few interactions and tweaks. Now, a simple swipe controlled the vertical movement of the flag. I also included a “Seahawks chant” sound effect whose background volume gradually increases as the flag moves up the pole. Once the flag reached the top, a share button would appear, allowing users to snap a picture of the raised flag and share it on any social media site.
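The interaction loop above comes down to three small rules: the swipe drags the flag, the chant volume tracks the flag's height, and the share button appears at the top. A minimal Python sketch of that logic, with an assumed pole height and function names that are illustrative rather than the app's actual code:

```python
# Illustrative sketch of the flag-raising interaction: a vertical swipe
# drags the flag up the pole, the crowd-chant volume rises with it, and
# the share prompt unlocks at the top. Names/constants are hypothetical.

POLE_HEIGHT = 10.0  # assumed virtual pole height, in scene units

def update_flag(flag_height, swipe_dy):
    """Move the flag by the swipe delta, clamped to the pole."""
    return min(max(flag_height + swipe_dy, 0.0), POLE_HEIGHT)

def chant_volume(flag_height):
    """Chant volume scales linearly from silent (bottom) to full (top)."""
    return flag_height / POLE_HEIGHT

def share_button_visible(flag_height):
    """The share prompt appears only once the flag reaches the top."""
    return flag_height >= POLE_HEIGHT
```

Coupling the audio to the flag's position (rather than playing it on a timer) is what makes the user feel they are driving the ceremony.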
The Boston Celtics launched an augmented reality shot chart feature for their fans during the 2017-2018 NBA playoffs. The feature allowed in-arena fans to scan a QR code printed on the game program and visualize a particular game's shot chart in 3D space.
The shot chart in the Celtics' official app displays all of its data in 2D. Presenting this to fans in a new way took a fair amount of research, which led to a feature that displays the shot chart data over a virtual court, with each shot represented as a tiny interactable basketball. Tapping a basketball opens the player info and other details specific to that game. Each game's shot data was tied to a distinct QR code, so users would collect these codes in order to review the stats of games they had attended earlier. In addition to the stats, the feature also provided an option to watch a game's highlights.
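Since each QR code keys a single game's data, the scan step reduces to parsing the payload and looking up that game's shots. The Python sketch below shows one plausible shape for that lookup; the payload format, field names and data layout are assumptions for illustration, not the app's actual scheme.

```python
# Illustrative sketch: each scanned QR code carries a game identifier,
# which selects that game's shot-chart entries for rendering as tappable
# 3D basketballs over the virtual court. Payload format is hypothetical.

def shots_for_scanned_code(qr_payload, shot_chart_db):
    """Return the list of shots for the game encoded in a QR payload.

    qr_payload: e.g. "celtics-shots:2018-04-15" (assumed format).
    shot_chart_db: {game_id: [shot, ...]} where each shot dict holds
    court coordinates, the shooter, and whether the shot was made.
    """
    prefix, _, game_id = qr_payload.partition(":")
    if prefix != "celtics-shots" or game_id not in shot_chart_db:
        return []  # unknown code: show nothing rather than wrong data
    return shot_chart_db[game_id]
```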
Let’s face it, painting your face to support your favorite team can be a pain. This one-of-its-kind fan experience lets fans overlay virtual face paint, helmets and other 3D models on a live image of themselves. Fans can then take selfies and share the photos on social media to show support for their favorite teams.
To make this experience work, the project was built on top of a face detection library. The points generated by detecting the user's face were used to generate a mesh, over which an image was applied as a texture. Development was done keeping in mind that the masks and virtual paint filters should be easy to modify without an app update, so all image assets were designed and programmed to load from our servers. This gave us the flexibility to deploy the feature for multiple teams with ease.
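The server-driven asset idea above can be sketched as a per-team manifest the app fetches at launch: swapping or adding filters then only means updating the manifest, never shipping a new build. The manifest fields, URL and function name below are all hypothetical, included only to illustrate the pattern.

```python
# Illustrative sketch of server-driven filter assets: the app downloads a
# manifest describing each team's face-paint textures, so filters can be
# changed without an app update. All fields/URLs here are assumptions.

def filter_urls(manifest, team_id):
    """Resolve the texture URLs for one team's face-paint filters."""
    base = manifest["cdn_base"]
    team = manifest["teams"].get(team_id, {})
    # Unknown teams simply get no filters instead of crashing the app.
    return [base + "/" + name for name in team.get("filters", [])]
```

The same lookup keyed by `team_id` is what made it cheap to roll the feature out across multiple teams.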
During the 2017 NFL season, the Kansas City Chiefs launched the "Coca-Cola Refreshing Moments in Chiefs History" augmented reality feature, a campaign that invited fans to purchase a special edition Chiefs-branded Coca-Cola can. The can could be scanned with the built-in camera of the team's official mobile app, triggering an AR video board to appear above it. Fans could then choose from a selection of historic plays on this video board. The virtual videos could be controlled by turning the physical can clockwise to rewind, or counterclockwise to fast-forward.
The Coke can acted as the starting point (trigger) for the experience, and everything else had to be built around it. Development started by exploring different ways to make the experience engaging; we then decided to use the can itself as the tool for interacting with the virtual world. The can in the real world acted as a video player controller in the virtual world.
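The can-as-controller idea boils down to mapping the tracked rotation of the can to a change in video playback position: clockwise rewinds, counter-clockwise fast-forwards, as described above. A minimal Python sketch of that mapping, with a made-up scrub sensitivity:

```python
# Illustrative sketch: the tracked yaw of the physical can scrubs the
# virtual video. Positive (clockwise) rotation rewinds; negative
# (counter-clockwise) rotation fast-forwards. The scrub rate is an
# assumed constant, not the app's real tuning.

SECONDS_PER_DEGREE = 0.05  # hypothetical scrub sensitivity

def scrub_video(current_time_s, rotation_delta_deg, video_length_s):
    """Map a change in the can's rotation to a new playback position."""
    new_time = current_time_s - rotation_delta_deg * SECONDS_PER_DEGREE
    return min(max(new_time, 0.0), video_length_s)  # clamp to the clip
```

Clamping to the clip bounds matters in practice, since fans twisting the can enthusiastically would otherwise scrub past either end of the video.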
The video was enclosed inside an oval-shaped 3D model resembling the video board at KC's Arrowhead Stadium. The feature was developed for both Android and iOS and was integrated into the Chiefs' official app.
This feature allows Milwaukee Bucks fans to actively participate in the REV UP moment through the Bucks' official team app. The Harley-Davidson-sponsored feature is designed to excite fans as a Harley-Davidson motorcycle is revved up and ridden inside Milwaukee's Bradley Center.
The feature was straightforward and was built within a short period. It involved a single simple interaction: tapping the Rev Up button (as shown in the image) triggers the sound of a Harley-Davidson motorcycle revving up. When multiple fans press the button at the same time, it adds to the excitement.
An augmented reality basketball game launched for the Raptors during the 2017-2018 NBA season. Like Deep in the Q, this app combines augmented reality technology with the classic “pop-a-shot” basketball game. With the flick of a finger, fans with the app can show off their basketball shooting skills by competing against other players in or outside Scotiabank Arena.
The experience was similar to the Cavs' Deep in the Q, with a couple of modifications: a timer was added to increase competitiveness, and the scoring system was modified. A system was also built to maintain an internal leaderboard alongside the Google Play Games/iOS Game Center leaderboards.
The AFL Richmond FC wanted to give its fans the experience of raising the 2017 premiership flag. An augmented reality experience was created and included in the official AFL Richmond FC app.
This experience was built on top of the existing Seattle Seahawks project, with minor changes to match the team's theme and colors.
At the 2016 NBA season opener, players, coaches and staff members received championship rings in the main ceremony. Courtesy of Nike, fans attending the game received silicone bands from the Cavaliers. The bands activated an augmented reality experience through the team's official app that simulated wearing the championship ring.
The experience would augment the championship ring around the user's finger when they pointed their phone at it. To build this, together with the official sponsor Nike, we came up with the idea of creating finger bands to distribute among fans in the stadium. After a couple of iterations, we arrived at a simple pattern that was easily detectable by the camera and at the same time fit well on a tiny finger band.
The 3D model of the championship ring that I received was highly detailed and had to be reduced to a low-poly model so it would run well on mobile devices. After gathering all the assets, it was time to work on the user experience flow and its interactions.
Our team “Akili” (Swahili for “smart”) developed a tablet trivia game for high school students in the US and Tanzania served by the Opportunity Education Foundation. The game's main focus was developing the students' self-assessment and life skills.
Akili conducted surveys and playtests with the target audiences to design and develop an experience that could be enjoyed globally. We introduced the concept of identities, which related to the future careers students might want to pursue. As a game designer and programmer, I implemented the gameplay mechanics for the single-player and split-screen multiplayer trivia modes, and I also programmed the backend integration for handling the quiz data.
Built an interactive installation with my teammates for middle school students of the Elizabeth Forward School District and the Bethlehem Centre. The game helps students understand the science behind fossil fuel exploration in a creative way. With a strong focus on geology and seismic mapping, our team created an asymmetric game in which students face the challenge of mapping the earth beneath its surface and locating fossil fuels. Students have access to different sets of tools and information and must work together to solve their layer-mapping puzzle.
Built several games within time frames of 2-3 weeks, with a focus on creating exciting user experiences using the available technology. As a gameplay programmer and game/sound designer, I created games for several platforms, including Windows and PlayStation, using tools such as Leap Motion, Oculus VR and Kinect.
Rigged an electronic circuit to implement a sonar-based collision avoidance system, and created a MATLAB simulation of a vision-based collision avoidance system. The sole purpose of this experiment was to assess the feasibility of autonomous flight for a MAV. An 8-bit PIC microcontroller, programmed in Embedded C, drove the circuit.
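The core of the sonar approach is simple arithmetic: an ultrasonic ping's round-trip time gives the obstacle distance, and the controller triggers avoidance below a threshold. The real system ran in Embedded C on the PIC; the Python sketch below only illustrates the calculation, and the avoidance threshold is an assumed value.

```python
# Illustrative sketch of sonar collision-avoidance arithmetic: distance
# is half the echo's round-trip time multiplied by the speed of sound.
# The stop threshold is a hypothetical tuning value, not the original
# system's constant.

SPEED_OF_SOUND_M_S = 343.0  # in air at roughly 20 degrees C
STOP_DISTANCE_M = 0.5       # assumed avoidance threshold

def echo_distance_m(round_trip_s):
    """Distance to an obstacle from a sonar echo's round-trip time."""
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0

def should_avoid(round_trip_s):
    """True when the obstacle is inside the avoidance threshold."""
    return echo_distance_m(round_trip_s) < STOP_DISTANCE_M
```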