Edge Developer / Engineer for Edge AI / Processing

Job Responsibilities

  • Develop and optimise code for size, memory footprint, data transfer, and bandwidth, including network detection and automatic switching, for deployment on resource-constrained edge devices
  • Hands-on working experience with one or more edge AI platforms, e.g. Google Coral Edge, IBM Edge, Azure Percept, etc. (with both the reference hardware and the platform)
  • Strong Python software development skills.
  • Expert-level knowledge of Linux, from system administration to troubleshooting.
  • Knowledge of containerization and orchestration.
  • Understanding of the RF properties of 5G networks, especially latency and bandwidth.
  • Knowledge of emerging IoT, edge, and mobile technologies and communication protocols.
  • Embedded C/C++ on ARM and Intel CPUs, under Linux or an RTOS.
  • Knowledge of devices/sensors, M2M, SCADA, edge computing, and common wireless protocols.
  • Advanced SQL and RDBMS design and query-building skills (Oracle, SQL Server, Redshift, etc.).
  • Knowledge of NoSQL databases is preferable.
  • Sound knowledge of standard ETL practices and experience building data processing pipelines.
  • Experience profiling, manipulating, and merging massive data sets using big data technologies, preferably Spark in Databricks, on at least one of AWS, Google, Microsoft, or Cloudera.
  • Visualization tool experience, especially with Tableau or Power BI.

Desired Skills and Experience

  • 6+ years of experience working with IoT connectivity protocols such as HTTPS, MQTT, Modbus, OPC UA Pub/Sub, IEC 61850, etc.
  • 3+ years of experience working with IoT platforms such as Azure IoT Hub, AWS IoT Core, Bosch IoT Suite, the ThingWorx platform, etc.
  • 3+ years of experience as a data engineer, preferably in data science groups.
  • Be the key anchor for data extraction, preparation, and hosting processes, especially for imagery/LiDAR.
  • Create master data files from disparate data sources by building data pipelines.
  • Develop and test architecture for Automated data extraction/processing.
  • Provide continuous connectivity to a master data source and access to refreshed data.
  • Ensure data quality and reliability.
  • Act as the data expert for datasets used by data scientists.
  • Understand the basic requirements of data science methodologies in order to collaborate with data scientists.

Bangalore, KA, IN