ISR Modeling and Simulation SME (Data Scientist) (MV) - 31608 in Arlington, VA at Alion Science and Technology

Date Posted: 9/25/2020

Job Snapshot

  • Location:
    Arlington, VA
  • Experience:
    Not Specified

Job Description

Job Responsibilities: 

You would be part of a team of ISR collection management professionals supporting the full scope of ISR operations and capabilities analysis of manned and unmanned terrestrial, maritime surface and sub-surface, airborne, and space-based sensors and systems by assessing the performance and effectiveness of ISR operations, force management, and the associated tasking, collection, processing, exploitation, and dissemination (TCPED) and supporting architectures. Analysts will develop and maintain methodologies, metrics, processes, and databases to assess ISR capabilities, utilization, and employment plans. Assessments focus on how ISR satisfies the intelligence collection requirements of Geographic Combatant Commands and Functional Component Commands. Modelers will apply designated, non-proprietary hardware and software, including physics-based ISR models, requirements scheduling software, and ISR optimization software.

Place of Performance: NMCC Pentagon Joint Staff J32

Mandatory Qualifications for the Intermediate ISR Modeling and Simulation SME:

All contractor personnel shall individually possess:
  • Proficiency with the Microsoft Office suite, including Outlook, Word, PowerPoint, Excel, and Access. 
  • Demonstrated knowledge of the Arc Geographic Information System (ArcGIS) and Google Earth or other similar visualization tools. 
  • Demonstrated written and oral communication skills that enable them to articulate requirements and procedures, convey taskings, and exchange information. 
  • Knowledge of the intelligence cycle and how it relates to DoD collection operations, platforms, and sensors. 
  • Demonstrated experience with the following: IC, DoD, and global ISR intelligence disciplines such as GEOINT and SIGINT; Collection Requirements Management and associated software tools such as GIMS, PRISM, NSRP, and SAVANT; and Collection Operations Management and associated software tools such as ROME and BVI. 
  • Demonstrated experience with COTS data analytic applications such as SAS Federal, Tableau, and Spotfire.
  • Must possess a Top Secret (TS) clearance with SCI eligibility.

The following skills are highly desirable:

  • Relevant current experience, defined as within the past five (5) years, in operational, physics-based, discrete-event, high-fidelity modeling, which shall include the following: 
  • Detailed analysis support for modeling individual and integrated systems of systems for existing, emerging, and conceptual global ISR systems and associated TCPED process; 
  • Environmental, system, and architecture modeling; 
  • Modeling and analyzing military ISR architecture, systems, and associated TCPED process; 
  • The entire technical development process to evolve and verify an integrated and life-cycle balanced set of system, people, product, and process solutions that satisfies the Government’s requirements; 
  • Python, Perl, HTML, C++, Java, and Visual Basic programming languages to build and maintain methods to extract, transform, and load data from a variety of ISR Web Services for modeling and analysis; and 
  • Conducting technical reviews of analysis output products including briefings and papers. 

Job Category: Involved in the analysis of unstructured and semi-structured data, including latent semantic indexing (LSI), entity identification and tagging, complex event processing (CEP), and the application of analysis algorithms on distributed, clustered, and cloud-based high-performance infrastructures. Exercises creativity in applying non-traditional approaches to large-scale analysis of unstructured data in support of high-value use cases visualized through multi-dimensional interfaces. Handles processing and index requests against high-volume collections of data and high-velocity data streams. Requires strong technical and computational skills in engineering, physics, or mathematics, coupled with the ability to design, develop, and deploy sophisticated applications using advanced unstructured and semi-structured data analysis techniques in high-performance computing environments. Uses advanced tools and computational skills to interpret, connect, predict, and make discoveries in complex data, and delivers recommendations for business and analytic decisions. Requires experience with the following:

  • Software development with either an open-source enterprise development stack (Java/Linux/Ruby/Python) or a Windows development stack (.NET, C#, C++). 
  • Data transport and transformation APIs and technologies such as JSON, XML, XSLT, JDBC, SOAP, and REST. 
  • Cloud-based data analysis tools, including Hadoop, Mahout, Accumulo, Hive, Impala, Pig, and similar. 
  • Visual analytic tools such as Microsoft Pivot, Palantir, or Visual Analytics. 
  • Open-source textual processing such as Lucene, Sphinx, Nutch, or Solr. 
  • Entity extraction and conceptual search technologies such as LSI, LDA, etc. 
  • Machine learning, algorithm analysis, and data clustering.


Security Clearance: Top Secret/Sensitive Compartmented Information (TS/SCI)

Next Steps

Join the Alion Talent Network today and stay up-to-date on our openings as they continue to become available! As a member of our network, you will receive alerts with new job opportunities that match your interests and have the ability to share job opportunities through social media or email. Join now!

Whether you choose to apply or just leave your information, we look forward to staying connected with you.

