Data Integration Team Lead (0600U) 24857
At the University of California, Berkeley, we are committed to creating a community that fosters equity of experience and opportunity, and ensures that students, faculty, and staff of all backgrounds feel safe, welcome and included. Our culture of openness, freedom and belonging makes it a special place for students, faculty and staff.
The University of California, Berkeley, is one of the world's leading institutions of higher education, distinguished by its combination of internationally recognized academic and research excellence; the transformative opportunity it provides to a large and diverse student body; its public mission and commitment to equity and social justice; and its roots in the California experience, animated by such values as innovation, questioning the status quo, and respect for the environment and nature. Since its founding in 1868, Berkeley has fueled a perpetual renaissance, generating unparalleled intellectual, economic and social value in California, the United States and the world.
We are looking for equity-minded applicants who represent the full diversity of California and who demonstrate a sensitivity to and understanding of the diverse academic, socioeconomic, cultural, disability, gender identity, sexual orientation, and ethnic backgrounds present in our community. When you join the team at Berkeley, you can expect to be part of an inclusive, innovative and equity-focused community that approaches higher education as a matter of social justice that requires broad collaboration among faculty, staff, students and community partners. In deciding whether to apply for a position at Berkeley, you are strongly encouraged to consider whether your values align with our Guiding Values and Principles (https://strategicplan.berkeley.edu/guiding-values-and-principles/), our Principles of Community (https://diversity.berkeley.edu/principles-community), and our Strategic Plan (https://strategicplan.berkeley.edu).
The Office of the CIO and Information Services & Technology (OCIO/IST) believes in and fosters a workplace environment where people can bring their diverse skills, perspectives and experiences toward achieving our goals through a process of critical inquiry, discovery and innovation, while simultaneously committing to making positive contributions toward the betterment of our world.
In addition, members of the OCIO/IST community have created and endorse the following values for our organization to augment and amplify the campus principles:
We champion diversity.
We act with integrity.
Diversity, Inclusion, and Belonging are more than just suggestions for us. They are the guiding principles underlying how we come together, develop leaders at all levels of the organization, and create an environment that unites us. We affirm the dignity of all individuals, call upon our leaders to address critical issues with integrity and intention, respect our differences as well as our commonalities, and strive to uphold a just community free from discrimination and hate.
UC Berkeley's Enterprise Data & Analytics seeks a technical lead with supervisory responsibilities for a team of four data engineers/analysts charged with integrating data from multiple campus data sources in support of University decision-making, reporting and analytics. The team supports a traditional star schema data warehouse built using Informatica as the ETL tool and an Oracle RDBMS as the data store. The team is also learning to support a data lake built using a variety of AWS technologies, including S3, Lambda, Glue, Athena, Redshift, etc. An ideal candidate will have solid Informatica PowerCenter experience and some familiarity with AWS big data technologies. As the Data Integration team lead, the candidate should have solid skills in people management, operations, customer relations, code development/deployment, testing/quality assurance, process discipline, metadata management and documentation.
Responsibilities
- Design. Acts as a lead analyst in the design of new architectural components, developing documentation and translating functional requirements into technical specifications for data integration/engineering projects. Reconciles development activity to an architecture roadmap. Plans and conducts data analysis efforts, facilitates design sessions, and collaborates with business intelligence developers, architects, technical staff, and key business sponsors to identify and recommend solutions. Initiates or updates designs for dimensional models supporting data warehouse functionality. Coordinates reviews of data models, ETL (Extract, Transform and Load) processes, and data definitions.
- Development. Builds, unit tests, and deploys ETL processes. Working with the data architect, modifies data warehouse data structures to support data integration initiatives. Working with the data architect and subject matter experts, designs and develops the architecture for new Data Warehousing components across subject areas and technical domains (e.g., tool integration strategy; data source ETL strategy; data staging, movement and aggregation; information and analytics delivery; and data quality strategy). Leverages and coordinates the efforts of technical team members and architects.
- Data Engineering. Creates and maintains optimal data pipeline architecture. Assembles large, complex data sets that meet functional and non-functional business requirements. Identifies, designs, and implements internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Builds the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Informatica, SQL, Python and AWS "big data" technologies.
- Standards. Establishes and maintains best practices for the design, development and support of data integration solutions, including documentation; metadata creation, use and maintenance; and data naming standards.
- Support. Oversees and continually improves production operations, including code deployments, bug fixes and overall release management and coordination. Ensures resilient operations stability via disaster recovery and business continuity planning and documentation. Coordinates with database, network and platform services teams on infrastructure capacity planning and implementation. Manages on-call production support for critical ETL processes on a rotation basis. Monitors and takes appropriate action to maintain the integrity of the data warehouse/data lake, including security and data quality issues. Responds to and resolves operational problems as necessary. Documents, troubleshoots and resolves day-to-day data issues. Works with DBAs, system administrators, and other IST support staff to complete scheduled hardware and software maintenance activities. Works with data security officers and auditors to ensure quality assurance programs are compliant with University standards. Responds to customer concerns and potential issues. Devises strategies to ensure the stability and availability of the data warehouse/data lake environment.
- Team Leadership. Supervises and provides technical leadership to a team of data engineers/analysts working on data projects with campus-wide scope and impact. Responsible for results in terms of costs, methods, and employees. Plans and reviews work of data engineers/analysts to measure meeting of objectives. Provides mentoring and feedback on employee performance.
Qualifications
- 5+ years of Data Warehouse / Data Engineering experience. Advanced knowledge and deep experience in the design of Data Warehouse and Business Intelligence architectures, data integration solutions, and data services specifications.
- Advanced knowledge of data warehousing/business intelligence best practices, standards, and architectures.
- Expert knowledge of data management systems, data administration practices and standards.
- Adept in the collection, management and conversion of raw data into usable information for analysts and data scientists.
- Expertise in creating and managing mappings and workflows with Informatica PowerCenter.
- Experience building and optimizing (using Python or other programming languages) "big data" pipelines, architectures and data sets.
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
- Strong analytic skills related to working with a wide variety of source data systems, both structured and unstructured.
- A successful history of manipulating, processing and extracting value from large disconnected datasets.
- Working knowledge of message queuing, stream processing, and highly scalable "big data" data stores.
- Must have expert knowledge of logical data design, data warehouse design, and data integration, as well as the management of web content or other unstructured data.
- Knowledge of data modeling tools and a sound understanding of dimensional data modeling techniques and principles.
- Ability to represent relevant information requirements from real-world processes in abstract models. Should be able to understand and model complex knowledge-intensive processes.
- Must be familiar with data model patterns in several common business or academic domains.
- Strong documentation skills for communicating specifications, processes and metadata to both technical and non-technical audiences.
- Strong analytical and design skills, including the ability to abstract information requirements from real-world processes to understand information flows in computer systems.
- Proficient with SQL, including advanced analytical functions.
- The ability to assess business requirements, develop technical specifications, and design appropriate Data Warehouse/BI solutions.
- Should be knowledgeable about data quality and governance issues and requirements.
- Experience with metadata management and tools.
- Ability to develop and maintain good working relationships with internal and external groups.
- Excellent interviewing and listening skills.
- Good negotiation and influencing skills.
- Excellent written and oral communication skills.
- Strong facilitation, organization and problem solving skills.
- Welcomes differing skills, outlooks, and experiences of others working toward shared goals.
- Experience with big data tools: Hadoop, Spark, Presto, Hive etc.
- Experience with data pipeline and workflow management tools: Airflow, etc.
- Experience with AWS cloud services: S3, Lambda, Glue, Redshift, CloudWatch, SQS, EMR, EC2, RDS, Step Functions, etc.
- Experience with stream-processing systems: Kinesis, Kafka, Spark Streaming, etc.
- Experience with object-oriented or functional scripting languages: Python, Java, C++, Scala, etc.
- Experience with Oracle/PeopleSoft Financials, HCM and/or Campus Solutions source ERPs
- Experience with Master Data Management and communicating its value
Salary & Benefits
Salary commensurate with experience. For information on the comprehensive benefits package offered by the University, visit:
The minimum posting duration of this position is 14 calendar days. The department will not initiate the application review process prior to 10/18/2021.
Conviction History Background
This is a designated position requiring fingerprinting and a background check due to the nature of the job responsibilities. Berkeley does hire people with conviction histories and reviews information received in the context of the job responsibilities. The University reserves the right to make employment contingent upon successful completion of the background check.
Equal Employment Opportunity
The University of California is an Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status. For more information about your rights as an applicant see:
For the complete University of California nondiscrimination and affirmative action policy see:
To apply, visit https://apptrkr.com/2594349
Copyright ©2021 Jobelephant.com Inc. All rights reserved.