About the job
Nivasoft (https://nivasoft.com/) is looking for Certified Google Cloud Professionals who can represent us in this opportunity (Google Partnership Program).
The initial budget would be allocated to Nivasoft on a retainer basis of 2 hours/day over 5 days, mainly for non-technical work (documentation and planning) during the POC phase, which spans 2 months. Timings are flexible.
Most important: candidates should represent Nivasoft in all respects, on paper and with the customer. Their certificate should carry the Nivasoft name, which can be changed back once the partnership process is complete on our side. An MSA covering this has already been drafted.
We request you to be a part of this endeavor and enjoy the journey ahead.
Enclosed below is the Job Description:
We are looking for GCP-certified professionals/freelancers in Bangalore, India, with experience in Google Cloud Platform, Python, BigQuery, Dataflow, and Google Data Studio. The resource must have worked on real-time GCP implementations and have experience with Google Cloud Platform services such as Dataproc, Dataflow, Composer, BigQuery, Bigtable, etc.
GCP certification is mandatory.
Skills Required:
• Certified Google Cloud Professional with hands-on experience in GCP IaaS and PaaS components, containers, Kubernetes, storage, servers, and applications, to deliver end-to-end cloud infrastructure architectures and designs;
• Designing solutions and architecting applications for the cloud (high availability, fault tolerance, elasticity, security);
• Participate in solution building for moving existing workloads and/or traditional infrastructure to the cloud;
• Must be able to articulate the technical merits and value of cloud computing, with comparative in-depth analysis of GCP;
• Good knowledge of big data processing using Bigtable and BigQuery;
• Experience developing products using MapR, HBase, and Spark;
• Experience with Google Cloud Platform, Python, BigQuery, Dataflow, and Google Data Studio;
• Must have worked on real-time GCP implementations;
• Experience with Google Cloud Platform services such as Dataproc, Dataflow, Composer, BigQuery, Bigtable, etc.;
• Strong working experience with the storage, querying, processing, and analysis of big data, and with ETL;
• Strong experience in the design and development of big data solutions, including data engineering, data analysis, data modelling, data warehousing, data security, and ETL processing;
• Proficiency in at least one programming language: Python or Scala;
• Perform complex day-to-day development/support for the GCP cloud environment;
• Develop, maintain, and tune highly complex scripts using Python and BigQuery;
• Manage and design solutions related to data engineering, data analysis, data modelling, data warehousing, data security, and ETL;
• CI/CD implementation of cloud solutions;
• Automation of infrastructure deployment;
• Experience with databases (relational, MPP, NoSQL);
• Experience in batch and real-time data processing (using products like Kafka);
• Expertise in ETL tools (preferably cloud-based solutions);
• Excellent communication and stakeholder management skills;
• Should be able to recommend architectural best practices and suggest improvements for existing applications.