
Senior Data Engineer

Capital One


Location:
McLean, VA
Date:
03/23/2017
Job Code:
capitalone2-R18333
Categories:
  • Engineering

Job Details

Company: Capital One

Job Title: Senior Data Engineer

JobID: capitalone2-R18333

Location: McLean, VA, 22106, USA

Description: McLean 1 (19050), United States of America, McLean, Virginia



Senior Data Engineer



Do you want to be on the forefront of the next BIG thing in data? Do you love designing and implementing business critical data management solutions? Do you enjoy solving complex business problems in a fast-paced, collaborative, and iterative delivery environment? If this excites you, then keep reading!



To help us use technology to bring ingenuity, simplicity, and humanity to banking, we're seeking a Data Engineer to mentor other engineers and provide hands-on expertise and design for the team.



It's all in the name: Deliver. In this role, you will be responsible for taking a hands-on approach to building teams to define and DELIVER against a robust technology roadmap. The right candidate for this role is someone who is passionate about their craft, welcomes challenges, thrives under pressure, and is hyper-focused on delivering exceptional results while creating a rewarding team environment.



The Job & Expectations:



- Collaborate as part of a cross-functional Agile team to create and enhance software that enables state-of-the-art, next-generation Big Data applications



- Build data APIs and data delivery services that support critical operational and analytical applications for our internal business operations, customers, and partners



- Use programming languages like Python, Java, and Scala, and databases such as AWS Redshift, Aurora, open-source RDBMSs like PostgreSQL, and NoSQL databases



- Develop and deploy distributed-computing Big Data applications using open-source frameworks like Apache Spark, Flink, and Kafka on the AWS Cloud



- Develop and implement data-enabling software using open-source frameworks and projects such as Spring and AngularJS



- Leverage DevOps techniques and practices like Continuous Integration, Continuous Deployment, Test Automation, Build Automation, and Test-Driven Development to enable the rapid delivery of working code, using tools like Jenkins, Maven, Nexus, Ansible, Git, and Docker



- Research and explore new technology patterns and frameworks when existing enterprise frameworks are not sufficient to meet customer needs.



- Handle multiple responsibilities in an unstructured environment where you're empowered to make a difference; in that context, you will be expected to research and apply cutting-edge technologies to accomplish your goals



- Apply data mining, machine learning, and statistical modeling tools and their underlying algorithms



Your interests:



- You geek out over obscure sports statistics. You ponder what drove your streaming music service's algorithms to accurately predict that you'd like the song you're listening to. Nate Silver asks you who's going to win the next election. You love data.



- You get a thrill out of using large data sets, some slightly messy, to answer real-world questions.



- You yearn to be a part of cutting-edge, high profile projects and are motivated by delivering world-class solutions on an aggressive schedule.



- You love building and leading teams to deliver world class solutions in a fun and rewarding environment.



- You are passionate about learning new technologies and mentoring junior engineers.



- Humor and fun are a natural part of your flow.



#ilovedata #bigdata #datawarehousing #transforminganalytics



Basic Qualifications:



- Bachelor's Degree in Computer Science, Computer Engineering, Management of Information Systems or military experience



- At least 3 years of hands-on work experience developing, deploying, and testing in an AWS architecture



- At least 4 years of professional work experience delivering big data solutions



- At least 4 years of professional work experience delivering data management solutions



- At least 4 years of professional work experience delivering data warehousing solutions



- At least 2 years of professional work experience delivering Software Solutions within an Agile delivery environment



Preferred Qualifications:



- Master's Degree in Information Systems or Computer Science



- 2+ years of experience with the Hadoop stack (MapReduce, Spark, Pig, Hive, HBase, Sqoop)



- 2+ years of experience delivering Java-based software solutions



- 4+ years of experience in at least one scripting language (Python, Scala)



- 4+ years of experience developing software solutions to solve business problems



At this time, Capital One will not sponsor a new applicant for employment authorization for this position.



At Capital One, we're building a leading information-based technology company. Still founder-led by Chairman and Chief Executive Officer Richard Fairbank, Capital One is on a mission to help our customers succeed by bringing ingenuity, simplicity, and humanity to banking. We measure our efforts by the success our customers enjoy and the advocacy they exhibit. We are succeeding because they are succeeding.

