
Job Description:

  • Snowflake SQL – writing SQL queries against Snowflake and developing scripts (Unix shell, Python, etc.) to extract, load, and transform data.
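
As a rough illustration of this kind of scripting, here is a minimal Python sketch, assuming the snowflake-connector-python package and placeholder account, stage, and table names (none of which come from the posting), that loads staged files into Snowflake and runs a transform query:

```python
# Minimal ELT sketch against Snowflake; all object names are illustrative.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ELT_WH",
    database="ANALYTICS",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # Extract/Load: copy staged CSV files into a raw table.
    cur.execute("""
        COPY INTO RAW.ORDERS_RAW
        FROM @RAW.ORDERS_STAGE
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    # Transform: aggregate the raw rows into a reporting table.
    cur.execute("""
        CREATE OR REPLACE TABLE REPORTING.DAILY_ORDER_TOTALS AS
        SELECT order_date, SUM(order_amount) AS total_amount
        FROM RAW.ORDERS_RAW
        GROUP BY order_date
    """)
finally:
    conn.close()
```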

Qualifications:

  • Must have good knowledge of architecting and implementing very large-scale data intelligence solutions around Snowflake Data Warehouse.
  • Minimum 3 years’ experience with Snowflake. Hands-on experience with Snowflake utilities such as SnowSQL, Snowpipe, Python, Tasks, Streams, Time Travel, Optimizer, Metadata Manager, data sharing, and stored procedures (a Streams/Tasks sketch follows this list).
  • In-depth understanding of Data Warehouse/ODS and ETL concepts and data modeling principles.
  • Experience in data warehousing – OLTP, OLAP, dimensions, facts, and data modeling.
  • Experience gathering and analyzing system requirements
  • Good working knowledge of any ETL tool (such as Talend, Informatica and SSIS)
  • Good to have familiarity with data visualization tools (Tableau/Power BI)
  • Proven analytical skills and a problem-solving attitude
  • Ability to function effectively in a cross-team environment
  • Good to have exposure to the AWS/Azure data pipeline ecosystem
  • Experience in implementing solutions for Retail, Finance, Insurance and Healthcare domains
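
To give a concrete picture of the Streams and Tasks utilities named above, here is a hedged sketch (every database, schema, table, and warehouse name is illustrative) that creates a stream on a staging table and a scheduled task that merges the captured changes into a target table, issued through the same Python connector:

```python
# Sketch of Snowflake Streams and Tasks via the Python connector.
# All object names are illustrative placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
)

statements = [
    # A stream records inserts/updates/deletes made to the staging table.
    "CREATE OR REPLACE STREAM RAW.ORDERS_STREAM ON TABLE RAW.ORDERS_RAW",
    # A task periodically merges the captured changes into a curated table.
    """
    CREATE OR REPLACE TASK RAW.MERGE_ORDERS_TASK
      WAREHOUSE = ELT_WH
      SCHEDULE = '15 MINUTE'
    AS
      MERGE INTO CURATED.ORDERS AS tgt
      USING RAW.ORDERS_STREAM AS src
        ON tgt.order_id = src.order_id
      WHEN MATCHED THEN UPDATE SET tgt.order_amount = src.order_amount
      WHEN NOT MATCHED THEN INSERT (order_id, order_amount)
        VALUES (src.order_id, src.order_amount)
    """,
    # Tasks are created suspended; resuming starts the schedule.
    "ALTER TASK RAW.MERGE_ORDERS_TASK RESUME",
]

try:
    cur = conn.cursor()
    for stmt in statements:
        cur.execute(stmt)
finally:
    conn.close()
```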

Job Description:

  • Design, develop, test, and deploy Power BI scripts. Create Power BI dashboards, reports, and KPI scorecards, convert existing manual reports, and support Power BI dashboard deployment (a deployment-support sketch follows this list).
  • Lead the evaluation, implementation, and deployment of emerging tools and processes for analytics data engineering to improve the team's productivity.
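
One recurring deployment-support chore is refreshing a published dataset after its source data changes. The sketch below is a hedged illustration, assuming the Power BI REST API dataset-refresh endpoint, an Azure AD access token, and workspace/dataset IDs obtained elsewhere (none of these identifiers come from the posting):

```python
# Illustrative only: trigger a refresh of a published Power BI dataset.
# Assumes an Azure AD access token with Power BI API permissions is available.
import os
import requests

ACCESS_TOKEN = os.environ["POWERBI_ACCESS_TOKEN"]  # acquired via Azure AD (e.g. MSAL)
WORKSPACE_ID = os.environ["POWERBI_WORKSPACE_ID"]  # also known as the group id
DATASET_ID = os.environ["POWERBI_DATASET_ID"]

url = (
    "https://api.powerbi.com/v1.0/myorg/"
    f"groups/{WORKSPACE_ID}/datasets/{DATASET_ID}/refreshes"
)
response = requests.post(url, headers={"Authorization": f"Bearer {ACCESS_TOKEN}"})
response.raise_for_status()  # 202 Accepted means the refresh was queued
print("Refresh requested:", response.status_code)
```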

Qualifications:

  • Experience in multiple database technologies such as MS SQL Server, Oracle, MySQL, PostgreSQL, AWS Redshift.
  • Expertise in SQL, Python and data analysis.
  • Knowledge of Azure
  • Experience with BI tools such as Tableau, Power BI, Looker, Shiny
  • Knowledge of data and analytics concepts such as dimensional modelling, reporting tools, data governance, and structured and unstructured data.
  • Knowledge of DAX queries in Power BI Desktop.
  • Experience in data preparation and BI projects (a data-preparation sketch follows this list).
  • Strong analytical and problem-solving skills
  • Strong exposure to visualization, transformation, and data analysis
  • Good communication and collaboration skills
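
As a small, generic illustration of the data-preparation and dimensional-modelling skills listed above, the sketch below uses pandas to split a flat sales extract into a date dimension and a fact table ready for import into a BI tool; the file and column names are hypothetical:

```python
# Illustrative data preparation: derive a date dimension and a fact table
# from a flat sales extract (file and column names are hypothetical).
import pandas as pd

sales = pd.read_csv("sales_extract.csv", parse_dates=["order_date"])

# Date dimension: one row per calendar day appearing in the data.
dim_date = (
    sales[["order_date"]]
    .drop_duplicates()
    .assign(
        date_key=lambda d: d["order_date"].dt.strftime("%Y%m%d").astype(int),
        year=lambda d: d["order_date"].dt.year,
        month=lambda d: d["order_date"].dt.month,
    )
)

# Fact table: measures keyed by the surrogate date key.
fact_sales = sales.assign(
    date_key=sales["order_date"].dt.strftime("%Y%m%d").astype(int)
)[["date_key", "product_id", "quantity", "amount"]]

# Write outputs that Power BI / Tableau can import directly.
dim_date.to_csv("dim_date.csv", index=False)
fact_sales.to_csv("fact_sales.csv", index=False)
```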

Job Description:

  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using Python & SQL.
  • Work with Data Scientists/BI teams to assist with data-related technical issues and support their data processing needs.
  • Build analytical tools that utilize the data pipeline to provide actionable insights into key business performance metrics.
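
For a concrete, if simplified, picture of this pipeline work, the sketch below extracts a CSV, loads it into a relational database with SQLAlchemy, and queries a simple business metric from the loaded table; the file, table, and metric names are placeholders, not part of the posting:

```python
# Illustrative end-to-end pipeline: extract a CSV, load it into a relational
# database, and query a business metric (all names are placeholders).
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine("sqlite:///analytics.db")  # swap for Postgres/Redshift, etc.

# Extract + light transform.
orders = pd.read_csv("orders.csv", parse_dates=["order_date"])
orders["order_month"] = orders["order_date"].dt.to_period("M").astype(str)

# Load.
orders.to_sql("orders", engine, if_exists="replace", index=False)

# Analyze: monthly revenue as a key business performance metric.
with engine.connect() as conn:
    rows = conn.execute(text(
        "SELECT order_month, SUM(amount) AS revenue "
        "FROM orders GROUP BY order_month ORDER BY order_month"
    )).fetchall()

for month, revenue in rows:
    print(month, revenue)
```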

Qualifications:

  • Proficient in Python and SQL.
  • Good understanding of relational databases and big data environments
  • Proficient in building end-to-end data pipelines.
  • Experience handling end-to-end development of projects without additional support
  • Excellent communication skills and proven ability to handle customers.