Would you like to be part of an exciting start-up project at a well-known, stable, international company in Brno? Are you an experienced software engineer with knowledge of Java and Python, and do you have experience building data pipelines and working with ETL technologies? If YES, this is a great career opportunity for you!
Analyzes, develops, designs, and maintains software for the organization's products and systems. Performs system integration of software and hardware to maintain throughput and program consistency. Develops, validates, and tests structures and user documentation. Work is evaluated upon completion to ensure objectives have been met. Determines and develops approaches to solutions.
Responsibilities:
- Big Data Processing:
- Design, develop, and maintain scalable data pipelines using technologies such as Apache Airflow and Google Cloud Composer.
- Implement ETL (Extract, Transform, Load) processes to ingest, clean, and process large volumes of data from various sources.
- Strong knowledge of object storage such as Google Cloud Storage, or similar technologies like S3 or HDFS.
- Strong knowledge of streaming ETL technologies such as Apache Beam, Google Dataflow, Spark, or Databricks.
- Strong knowledge of SQL/OLAP databases such as BigQuery, or similar technologies like Hive.
- Optimize data processing workflows to ensure high performance and efficiency.
- Nice to have: knowledge of Cloud Functions, Cloud Build/Jenkins, and cloud networking.
- Desired Skills:
- REST API development with Spring Boot and WebFlux.
- Knowledge of web security.
- Collaboration and Communication:
- Work closely with data scientists, engineers, and business analysts to understand data requirements and deliver solutions.
- Participate in code reviews, provide constructive feedback, and ensure coding standards are met.
- Communicate effectively with team members and stakeholders to ensure project goals and timelines are met.
Qualifications:
- Required Qualifications:
- Bachelor’s or Master’s degree in Computer Science, Information Technology, Data Science, or a related field.
- Strong programming skills in languages such as Python and Java.
- Experience with data analysis tools and libraries (e.g., Pandas, NumPy, Scikit-learn).
- Experience with cloud platforms (e.g., Google Cloud) and their data processing services.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.
- Preferred Qualifications:
- Experience with containerization and orchestration tools like Docker and Kubernetes.
- Familiarity with CI/CD pipelines and version control systems like Git.
- Knowledge of GCP is preferred.
If the position interests you, apply directly or contact me for more information about the role!