Are you applying to the internship?
Job Description
About LiveRamp:
LiveRamp is a leading data connectivity platform whose mission is to make it safe and easy for businesses to use data effectively. The company believes connected data can change the world, powering insights and experiences centered on the needs of real people while keeping the internet open for all. LiveRamp prioritizes a collaborative environment where employees thrive on curiosity, humility, and fun, and it actively seeks smart, kind, and creative people to join the team. The company operates globally, with a focus on diversity, inclusion, and belonging.
Internship Job Description (Full-Stack Application Development & Big Data Back-End Engineering):
LiveRamp offers a 12-week internship program designed to provide a comprehensive learning experience. Interns receive the same orientation training as full-time engineers, are paired with individual mentors, and are expected to commit production code within their first week. Interns work on mission-critical projects and make meaningful contributions to the company's success. A full-time offer after graduation may be extended at the internship's conclusion. The internship focuses on full-stack application development and big data back-end engineering.
Co-op Job Description (Data Science in Ad-Tech):
This co-op opportunity is with LiveRamp’s Insights Engineering team and focuses on applying data science to the ad-tech space. Responsibilities include:
• Utilizing statistical and ML-based techniques to reduce selection bias, build representative samples, and analyze randomized studies.
• Collaborating with data science and engineering teams to design and implement data models measuring ad performance across large datasets (see the PySpark sketch after this list).
• Conducting R&D by prototyping new statistical and data modeling frameworks, translating prototypes into SQL and PySpark workflows, and deploying them to production.
• Working across cloud environments (AWS and GCP) to build, monitor, and troubleshoot automated modeling jobs using technologies like Spark and BigQuery.
• Troubleshooting internal and external models by analyzing their components and underlying assumptions.
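Below is a minimal, hypothetical PySpark sketch of the kind of workflow these bullets describe: aggregating a per-campaign conversion rate from exposure and conversion events. The table names (exposure_events, conversion_events, campaign_performance) and columns (campaign_id, user_id) are illustrative assumptions, not LiveRamp's actual schema.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("ad-performance-sketch").getOrCreate()

# Hypothetical event tables: one row per ad exposure and per conversion.
exposures = spark.table("exposure_events").select("campaign_id", "user_id").dropDuplicates()
conversions = (spark.table("conversion_events")
               .select("campaign_id", "user_id").dropDuplicates()
               .withColumn("converted", F.lit(1)))

# Left-join conversions onto exposures and compute a per-campaign conversion rate.
performance = (exposures
               .join(conversions, on=["campaign_id", "user_id"], how="left")
               .groupBy("campaign_id")
               .agg(F.count("user_id").alias("exposed_users"),
                    F.sum(F.coalesce(F.col("converted"), F.lit(0))).alias("converted_users"))
               .withColumn("conversion_rate", F.col("converted_users") / F.col("exposed_users")))

# In production this kind of job would be scheduled and monitored on AWS or GCP,
# as the bullets above describe; here it simply writes a summary table.
performance.write.mode("overwrite").saveAsTable("campaign_performance")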
Required Skills for Co-op (Data Science):
• Strong mathematics, statistics, and computer science skills.
• Pursuit of a degree in a Data Science-related field (e.g., Mathematics, Statistics, Computer Science, Physics, or other STEM degrees).
• Enthusiasm for learning new data science techniques and technologies.
• Willingness to learn through practical experience.
• Ability to collaborate effectively to solve complex problems.
• A “startup personality”: smart, ethical, friendly, hardworking, and proactive.
• Interest in working on products with direct customer interaction.
Preferred Skills for Co-op (Data Science):
• Strong background in probability and statistics, experience implementing statistical models in SQL, and familiarity with randomized controlled trials.
• Experience contributing to a mature Python-centric code base.
• Proficiency in common data science toolkits (pandas, NumPy, Matplotlib, scikit-learn); a short illustrative sketch using these follows this list.
• Experience writing SQL and working with relational databases.
• Understanding of software engineering fundamentals (object-oriented programming, computational complexity, version control with Git, command-line interfaces like bash).
• Knowledge of Big Data stacks (Hadoop and Spark).
• Experience with AWS and/or GCP cloud providers.
• Experience with Agile software development.
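As a concrete illustration of the selection-bias work and toolkits mentioned above, here is a minimal, hypothetical sketch of one common bias-reduction technique, inverse propensity weighting, using pandas and scikit-learn. The input file and column names (panel_sample.csv, in_sample, age, region_code, converted) are assumptions made for the example, not details from the posting.

import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

df = pd.read_csv("panel_sample.csv")  # hypothetical panel data

# Model the probability that a user appears in the (non-random) sample,
# given observable covariates.
covariates = ["age", "region_code"]
X, observed = df[covariates], df["in_sample"]
propensity = LogisticRegression(max_iter=1000).fit(X, observed).predict_proba(X)[:, 1]

# Weight sampled users by the inverse of their estimated inclusion probability
# so the weighted sample better represents the full population.
sampled = df[df["in_sample"] == 1].copy()
sampled["weight"] = 1.0 / propensity[(df["in_sample"] == 1).to_numpy()]

weighted_conversion_rate = np.average(sampled["converted"], weights=sampled["weight"])
print(f"Bias-adjusted conversion rate: {weighted_conversion_rate:.4f}")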