Careers

Informatics Data Analyst

Job ID# 10261 – Posted 6/06/2023 – Remote, Downey, CA

Position Description

An Informatics Data Analyst works independently to document external data acquisition policies and procedures and interfaces with other business units to define and document data needs and ad hoc analysis requirements. The analyst identifies business practice workflow and process issues and concerns; provides leadership and guidance in defining system and process requirements that optimize system performance and technology utilization by system users; and designs information systems and technologies that ensure access and transparency.

The Informatics Data Analyst will:

1. Develop, implement, organize, and maintain information and reports that document operational and financial performance.
2. Collaborate with the quality team, administrators, and other stakeholders to design ad hoc and other relevant routine reports.
3. Monitor quality programs and work to improve them.
4. Compile data from multiple sources and import it into the relevant database.
5. Select appropriate tools and methods to maintain existing programs, ensuring that data is transferred correctly and that all reports are well documented to increase efficiency.
6. Assist with the analysis of external data definitions documentation and the mapping process.
7. Perform a quality assurance function for data integration processes and repository metrics.
8. Collaborate with other staff to document data needs and metric definitions.
9. Validate data load processes and the quality of the data loaded into data repositories.
10. Maintain, store, map, and analyze data in compliance with policies and procedures.
11. Coordinate with the LAN and IT Security teams to use PGP encryption software and file transfer protocols.
12. Use software such as SAS, Access, and Excel, and provide data mapping for integration using Business Objects and SQL.

Skills Required

The Informatics Data Analyst will possess knowledge and experience in:

1. Basic statistical concepts.
2. Identifying trends, business opportunities, and relevant issues.
3. Commonly used demographics and demographic databases.
4. Definitions of fields and how data is entered and processed.
5. Data mining and data intelligence.
6. Statistically valid analysis and measurement methodologies.
7. Packaging data, including infographics and other data display techniques, and report design options.
8. Data interpretation and validation.
9. Ad hoc analysis and performance evaluation principles.
10. SQL coding and report services.
11. Data warehouse design.
12. Web-based application design.
13. PC skills (e.g., Microsoft Office, including Word, Excel, and PowerPoint, as well as SAS and Access).
14. Working independently as well as taking a leadership role in creating process documentation and interfacing with other team members to obtain that information.

The analyst must also be detail-oriented, with excellent interpersonal, oral, and written communication skills, the ability to handle multiple tasks, and strong problem-solving and troubleshooting skills.

Skills Preferred

1. Understanding of the Azure Synapse Analytics architecture, data warehousing concepts, and components such as SQL pools, Apache Spark pools, and Data Flow.
2. Strong knowledge of SQL for data extraction, transformation, and loading (ETL) tasks, including complex querying, stored procedures, views, and indexing.
3. Proficiency in designing and implementing efficient data models for data warehousing, including dimensional modeling techniques such as star and snowflake schemas (a minimal illustration follows this list).
4. Familiarity with programming languages like Python or Scala for data processing, analytics, and automation tasks in Azure Synapse Analytics.
5. Understanding of various data integration methods, including data ingestion from different sources, data replication, and data synchronization techniques.
6. Ability to optimize query performance and data processing in Azure Synapse Analytics through techniques like partitioning, indexing, and query optimization.
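To make items 2 and 3 above concrete, the sketch below shows a minimal star schema and a typical reporting query in generic ANSI-style SQL. All table and column names (dim_date, dim_program, fact_service_events) are hypothetical and chosen only for illustration; exact syntax and constraint support vary by platform.

    -- Minimal star schema: one fact table joined to two dimension tables.
    -- All names here are hypothetical and for illustration only.
    CREATE TABLE dim_date (
        date_key      INT PRIMARY KEY,   -- surrogate key, e.g. 20230606
        calendar_date DATE NOT NULL,
        fiscal_year   INT NOT NULL
    );

    CREATE TABLE dim_program (
        program_key  INT PRIMARY KEY,
        program_name VARCHAR(100) NOT NULL
    );

    CREATE TABLE fact_service_events (
        date_key     INT NOT NULL REFERENCES dim_date (date_key),
        program_key  INT NOT NULL REFERENCES dim_program (program_key),
        event_count  INT NOT NULL,
        total_cost   DECIMAL(12, 2) NOT NULL
    );

    -- Typical reporting query: events and cost by program and fiscal year.
    SELECT d.fiscal_year,
           p.program_name,
           SUM(f.event_count) AS events,
           SUM(f.total_cost)  AS cost
    FROM fact_service_events f
    JOIN dim_date d    ON d.date_key = f.date_key
    JOIN dim_program p ON p.program_key = f.program_key
    GROUP BY d.fiscal_year, p.program_name
    ORDER BY d.fiscal_year, p.program_name;

In this layout the fact table holds additive measures keyed by surrogate keys, and each dimension describes one axis of analysis; a snowflake schema would further normalize the dimensions into related tables.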

Experience Required

This classification requires a minimum of two (2) years of experience in the processing, management, and retrieval of information.

Experience Preferred

1. Two (2) to four (4) years of experience working with the Microsoft Azure cloud platform, including familiarity with related Azure services such as Azure Data Lake Storage, Azure Databricks, Azure Data Factory, and Azure DevOps.
2. Two (2) to four (4) years of experience designing and developing data warehouses on platforms such as Microsoft SQL Server, Oracle, or Teradata, which is valuable for understanding data warehousing concepts and best practices.
3. Two (2) to four (4) years of experience with big data technologies such as Apache Hadoop, Apache Spark, or Apache Kafka, which integrate with Azure Synapse Analytics and enable large-scale data processing and analytics.
4. Two (2) to four (4) years of experience with data visualization tools such as Power BI or Tableau to create insightful visualizations and reports based on data stored in the Synapse data warehouse.
5. Two (2) to four (4) years of experience with data cleansing, data profiling, and data validation techniques to ensure high data integrity in the data warehouse (a sketch of typical checks follows this list).
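As an illustration of item 5 above (not a prescribed procedure), the query below shows the kind of profiling and validation checks commonly run against a staging table before data is loaded into the warehouse. The table and column names are hypothetical.

    -- Simple profiling and validation checks on a hypothetical staging table,
    -- run before loading into the warehouse. All names are illustrative only.
    SELECT COUNT(*)                             AS total_rows,
           COUNT(*) - COUNT(member_id)          AS rows_missing_member_id,
           COUNT(*) - COUNT(DISTINCT member_id) AS duplicate_member_ids,
           SUM(CASE WHEN total_cost < 0 THEN 1 ELSE 0 END) AS negative_cost_rows
    FROM stg_service_events;

Row counts, null counts, duplicate keys, and out-of-range values are simple checks, but they catch many of the most common load problems before bad data reaches downstream reports.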

Education Required

This classification requires the possession of a bachelor’s degree in an IT-related or Engineering field. Additional qualifying experience may be substituted for the required education on a year-for-year basis.

Apply Now

Please send your resume and any additional information to our recruitment team at recruitment@nexlogica.com.