The best candidate for this position would be a self-motivated individual with strong attention to detail and prior testing experience. The selected candidate will test large-scale cloud systems and develop code in support of that effort.
• Required – Shall have at least eight (8) years' experience in software development/engineering, including requirements analysis, software development, installation, integration, evaluation, enhancement, maintenance, testing, and problem diagnosis/resolution.
• Required – Shall have demonstrated experience working with open-source (NoSQL) products that support highly distributed, massively parallel computation needs, such as HBase, CloudBase/Accumulo, Bigtable, etc.
• Required – Shall have demonstrated work experience with the MapReduce programming model and technologies such as Hadoop, Hive, Pig, etc.
• Required – Shall have demonstrated work experience with the Hadoop Distributed File System (HDFS).
• Required – Shall have demonstrated work experience with serialization formats such as JSON and/or BSON.
• Required – Shall have demonstrated work experience in the requirements analysis and design of at least one object-oriented system.
• Required – Shall have demonstrated work experience developing solutions integrating and extending FOSS/COTS products.
• Required – Shall have at least three (3) years' experience in software integration and software testing, to include developing and implementing test plans and test scripts.
• Required – Shall have demonstrated technical writing skills and shall have generated technical documents in support of a software development project.
• Required – Experience developing and deploying: data driven analytics; event driven analytics; sets of analytics orchestrated through rules engines.
• Required – In addition, the candidate shall have demonstrated work experience in at least four (4) of the desired characteristics.
• Desired – Experience developing and deploying:
  – analytics that include foreign language processing;
  – analytic processes that incorporate/integrate multimedia technologies, including speech, text, image, and video exploitation;
  – analytics that function on massive data sets (e.g., more than a billion rows or larger than 10 petabytes);
  – analytics that employ semantic relationships (i.e., inference engines) between structured and unstructured data sets;
  – analytics that identify latent patterns between elements of massive data sets (e.g., more than a billion rows or larger than 10 petabytes);
  – analytics that employ techniques commonly associated with artificial intelligence (e.g., genetic algorithms).
• Required – Shall have at least six (6) years of experience developing software with high-level languages such as Java, C, and C++.
• Required – Shall have demonstrated work experience developing RESTful services.
• Required – Shall have at least five (5) years' experience developing software for Windows (2000, 2003, XP, Vista) or UNIX/Linux (Red Hat versions 3–5) operating systems.
• Desired – Experience designing and developing automated analytic software, techniques, and algorithms.
• Desired – Experience with taxonomy construction for analytic disciplines, knowledge areas, and skills.
• Desired – Experience developing and deploying analytics that discover and exploit social networks.
• Desired – Experience documenting ontologies, data models, schemas, formats, data element dictionaries, software application program interfaces (APIs), and other technical specifications.
• Desired – Experience developing and deploying analytics within a heterogeneous schema environment.
• Desired – Experience with linguistics (grammar, morphology, concepts).
• Desired – Understanding of big-data cloud scalability (e.g., Amazon, Google, Facebook).
• Required – Hadoop/Cloud Developer Certification.
NOTE: A degree in Communications, Computer Science, Mathematics, Accounting, Information Systems, Program Management, or a similar field will be considered a technical degree.
CENTERPOINT Inc. is an equal opportunity employer.