
Globe Life Family of Companies Internal Careers

Making Tomorrow Better

Cloud Data Engineer (Hybrid)



Category

Information Technology

All Locations

3700 S. Stonebridge Dr., McKinney, Texas

Job Location

3700 S. Stonebridge Dr., McKinney, Texas

Posted Date

2/8/2024

Tracking Code

15528

Position Type

Full-Time/Regular

At Globe Life, we are committed to empowering our employees with the support and opportunities they need to succeed at every stage of their career. Our thriving and dynamic community offers ample room for professional development, increased earning potential, and a secure work environment.

 

We take pride in fostering a caring and innovative culture that enables us to collectively grow and overcome challenges in a connected, collaborative, and mutually respectful environment that calls us to help Make Tomorrow Better.

 

Role Overview:

Could you be our next Cloud Data Engineer? Globe Life is looking for one to join the team!

 

In this role, you will be responsible for supporting our Cloud Data Management and Advanced Analytics platforms. You will work with the data services, data analyst, and data scientist teams to help the organization build out secure, scalable, fault-tolerant, and high-performing cloud-based architecture that enables and sustains data-driven decisions.

 

This is a hybrid position located in McKinney, TX (WFH Monday & Friday, In Office Tuesday-Thursday).

 

What You Will Do:

  • Design, implement, and support various database services of our Cloud Platform.
  • Provide technological guidance on Data Lake and Enterprise Data Warehouse design, development, implementation, and monitoring.
  • Understand, design, and implement data security around cloud infrastructure.
  • Provide support and guidance to Data Services and other application development teams on various AWS Database Products.
  • Work with leadership on process improvement and strategic initiatives on Cloud Platform.

 

What You Can Bring:

  • Bachelor's degree in Computer Science/Engineering or Information Systems, or equivalent work experience in a technical position.
  • 8+ years of experience in Information Technology.
  • 5+ years of experience in Database Engineering primarily in AWS Redshift, RDS/Aurora, DynamoDB, DMS, Glue.
  • Proven experience in building data pipelines and database applications.
  • Strong coding and scripting experience with Python, PowerShell, or similar languages.
  • Experience implementing Amazon EMR or Big Data technologies such as Hadoop, Spark, Presto, and Hive is a plus.
  • Prior domain experience in Life Insurance, Annuity, or Financial Services is a plus.
  • AWS Certifications are considered a strong plus.
  • Excellent verbal and written communication skills.
  • Ability to work independently and as part of a team.
  • Extensive hands-on experience, including design and implementation, across a broad range of database services on Amazon Web Services (AWS).
  • Experience with AWS Database Migration Service (AWS DMS) and migrating RDBMS (SQL Server/Oracle) from on-premises to AWS (Aurora/Redshift).
  • Solid understanding of various Data Management and Data Pipeline tools available in AWS.
  • Working knowledge of and experience with primary AWS services such as S3, Lambda, Batch, Glue, Athena, EC2, EBS, CloudWatch, CloudTrail, ECS, ECR, EMR, IAM, and SNS.
  • Development experience with any major ETL tool, preferably Informatica.
  • Good understanding of implementing data lakes and data warehouses in the cloud.
  • Experience in creating and deploying CloudFormation Templates (CFTs).
  • Experience with lifecycle management of S3 buckets.
  • Clear understanding of cloud database security, including AWS IAM users and access, IAM roles and policies, federated users, and permissions.
  • Good understanding of AWS encryption methodologies and AWS KMS services.
  • Experience with database performance testing and capacity planning.
  • Working knowledge and experience with software development life cycle (SDLC) and agile/iterative methodologies.
  • Implementation experience with Big Data technologies such as Hadoop, Spark, Presto, Hive, and Hue is a major advantage.
  • Knowledge of AWS Machine Learning Offerings will be a plus.
  • Experience with any Data Visualization tools will be a plus.

 

Applicable To All Employees of Globe Life Family of Companies:

  • Reliable and predictable attendance during your assigned shift.
  • Ability to work full time and/or part time based on the position specifications.

 

How Globe Life Will Support You:

Looking to continue your career in an environment that values your contribution and invests in your growth? We've curated a benefits package that helps ensure you don't just work, but thrive, at Globe Life:

  • Competitive compensation designed to reflect your expertise and contribution.
  • Comprehensive health, dental, and vision insurance plans because we believe that taking care of your well-being is fundamental to your performance.
  • Robust life insurance benefits and retirement plans, including a company-matched 401(k) and a pension plan.
  • Wellness club reimbursements and gym discounts to help you stay on top of your health.
  • Paid holidays and time off to support a healthy work-life balance.
  • Parental leave to help our employees welcome their new additions.
  • Development training programs to enhance your skills and career progression and unlock your full potential.

 

Opportunity awaits! Invest in your professional legacy, realize your path, and see the direct impact you can make in a workplace that celebrates and harnesses your unique talents and perspectives to their fullest potential. At Globe Life, your voice matters.

 

#DICE #AWS #DataEngineer #informatica #SQL

This position is located at 3700 S. Stonebridge Dr., McKinney, TX.
