OSQ

WHAT IS OSQ?

Open Source Quest is a structured, month-long open-source contribution program designed to help students gain real-world experience with professional software development workflows.

Participants work individually on curated, domain-specific GitHub repositories under guided mentorship. Rather than emphasizing competitive coding or short-term hackathons, the program focuses on understanding how real open-source projects function: through issues, pull requests, reviews, and consistent contributions.

The initiative is beginner-friendly and inclusive, welcoming students from all branches and academic years. Contributions are not limited to writing code alone; participants may also work on documentation, testing, research, and project maintenance, reflecting how modern software teams operate.

By combining mentorship, structured tasks, and transparent progress tracking, Open Source Quest aims to transform learners into confident, responsible open-source contributors while fostering a sustainable technical culture on campus.

MEET OUR JUDGE

Praneetha Kotla is a Lead Robotic Process Automation (RPA) Developer with over 11 years of industry experience across healthcare, pharmaceuticals, insurance, and enterprise systems.

Currently working with Johnson & Johnson in the United States, she brings strong expertise in automation, data engineering, and enterprise technology, along with experience as a speaker, mentor, and published researcher.

  • 11+ years of industry experience in Robotic Process Automation (RPA), Data Engineering, ETL, and Business Intelligence
  • Lead RPA Developer at Johnson & Johnson (USA), working on large-scale enterprise automation solutions
  • Previous roles at Corewell Health, Medline Industries, USAA, and BMW India
  • Experience across healthcare, pharmaceuticals, insurance, telecom, and enterprise systems
  • Strong background in automation, data integration, and scalable application development
Praneetha Kotla

Lead RPA Developer

ERP Smartlabs

Connect on LinkedIn

EVALUATION & GIT ENGINE SCORING

The evaluation in Open Source Quest (OSQ) is designed to recognize meaningful contributions, consistency, and learning impact, rather than raw activity counts. Judging combines automated GitHub-based metrics with human qualitative assessment.

Git Engine – Automated Metrics

The following parameters are automatically calculated by the OSQ Git Engine system:

  • Pull Requests Merged & Opened
  • Issues Created and Closed
  • Commit Activity
  • Code Reviews and Review Participation
  • Contribution Consistency Over Time
  • Repository-wise Contribution Distribution

These metrics together form the Base Score for each participant.
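
As a rough illustration of how such metrics could roll up into a single number, the Python sketch below uses a weighted sum; the metric names and weights here are assumptions for illustration only, not the Git Engine's internal formula.

    # Illustrative sketch only: metric names and weights are assumptions,
    # not the official Git Engine weighting.
    METRIC_WEIGHTS = {
        "prs_merged": 10,      # Pull Requests Merged
        "prs_opened": 3,       # Pull Requests Opened
        "issues_created": 2,   # Issues Created
        "issues_closed": 4,    # Issues Closed
        "commits": 1,          # Commit Activity
        "reviews": 5,          # Code Reviews and Review Participation
        "active_days": 2,      # Contribution Consistency (distinct active days)
    }

    def base_score(metrics: dict) -> float:
        """Weighted sum of the automated GitHub metrics."""
        return float(sum(METRIC_WEIGHTS[name] * metrics.get(name, 0)
                         for name in METRIC_WEIGHTS))

    # Example participant:
    # 4*10 + 2*4 + 25*1 + 3*5 + 12*2 = 112
    print(base_score({"prs_merged": 4, "issues_closed": 2,
                      "commits": 25, "reviews": 3, "active_days": 12}))

In a scheme like this, merged pull requests and reviews dominate raw commit counts, which matches the program's stated emphasis on meaningful contributions over raw activity.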

Repository-wise Judge Score

In addition to automated metrics, judges assign a Repository Quality Score ranging from 1 to 10, based on:

  • Overall quality of contributions
  • Impact and relevance of work
  • Code clarity and professionalism

This score is applied per repository and proportionally adjusts the scores of all contributors working on that repository, as sketched below.
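
OSQ does not publish the exact proportionality rule, so the sketch below shows only one plausible reading: each contributor inherits the repository's judge score weighted by their share of that repository's tracked activity.

    # Sketch only: "proportional" is interpreted here as activity share;
    # the official OSQ rule may differ.
    def contributor_judge_score(repo_judge_score: float,
                                contributor_activity: int,
                                repo_total_activity: int) -> float:
        """Weight the repo-level judge score (1-10) by the contributor's
        share of that repository's tracked activity."""
        if repo_total_activity == 0:
            return 0.0
        return repo_judge_score * (contributor_activity / repo_total_activity)

    # A contributor responsible for 30 of a repo's 100 tracked actions,
    # on a repository judged 8/10: 8 * 0.3 = 2.4
    print(contributor_judge_score(8, 30, 100))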

Final Score Calculation

The final score for each participant is calculated as:

FinalScore = 0.7 × BaseScore + 0.3 × NormalizedJudgeScore
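
The page does not define how the judge score is normalized, so here is a minimal sketch of the weighted combination, assuming the 1-10 judge score is rescaled onto the cohort's base-score range:

    # Sketch only: the normalization (mapping the 1-10 judge score onto the
    # cohort's maximum base score) is an assumption; OSQ's method may differ.
    BASE_WEIGHT = 0.7
    JUDGE_WEIGHT = 0.3

    def final_score(base: float, judge: float, max_base: float) -> float:
        """FinalScore = 0.7 * BaseScore + 0.3 * NormalizedJudgeScore."""
        normalized_judge = (judge / 10.0) * max_base
        return BASE_WEIGHT * base + JUDGE_WEIGHT * normalized_judge

    # Base score 112, cohort-max base 150, judge score 8/10:
    # 0.7 * 112 + 0.3 * (0.8 * 150) = 78.4 + 36.0 = 114.4
    print(round(final_score(112, 8, 150), 1))

Under the 70/30 weighting, the judge component can shift a participant's standing but not dominate it, keeping automated metrics as the primary signal.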

Leaderboard Updates

The weekly leaderboard is updated based on the Final Score.

  • Automated metrics ensure transparency and consistency
  • Judge scores ensure quality and real-world impact are fairly represented

Track progress. Learn consistently. Earn recognition.

TIMELINE

1 Feb

OSQ Kick Starts

8 Feb

1st Leaderboard Update

15 Feb

2nd Leaderboard Update

15 Feb

Last Date for Registration

22 Feb

3rd Leaderboard Update

28 Feb

OSQ Ends

5 Mar

Winner Announcement


CONTRIBUTE MORE.
EARN REWARDS.
SHINE ON THE LEADERBOARD.

Open Source Quest rewards consistent effort, quality contributions, and responsible collaboration.

As you contribute to repositories, your activity is tracked through GitHub-based signals such as pull requests, issue resolution, and contribution consistency. Progress is reflected on the live leaderboard, offering transparent insights into repository activity and individual performance.

Weekly highlights recognize contributors who demonstrate:

  • Meaningful and well-structured contributions
  • Consistency over time
  • Collaboration and review participation

The leaderboard is designed to motivate learning, not competition. Final recognitions are mentor-reviewed to ensure fairness and quality.

DON'T KNOW GITHUB?

Learn it today and take part in the competition.