Telematics / Intelligent Connected Vehicle Test Engineer (Foreign R&D Center) G00375 / Beijing

This position is listed on Liepin.

Typically, the telematics system software under test is provided in regular release cycles for each system generation. For each of these releases, the following tasks need to be performed:

  • Organization and preparation of test drives: The test equipment, including test vehicles and test benches, shall be updated prior to the test drive. The SW update shall be done via the diagnostic tools provided by Daimler.

After the SW update, a quick test shall be performed and the SW version of each telematics component shall be checked against the release configuration. The test engineers responsible for the individual applications should be invited to join the test drive and contacted in advance to clarify the testing schedule and the different testing requirements. Based on that, a comprehensive test drive invitation, including a hand-out, shall be prepared and shared with all involved parties.

  • Execution of defined test cases: A set of test cases has been defined. This set consists of basic test cases, which are used for the basic feature and acceptance tests, as well as specific application test cases, which have been defined and documented based on testing experience and cover test cases that have required special attention in the past.

Depending on the focus of a specific release test, defined sets of test cases shall be performed in alignment with the responsible feature owner.
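As an illustration, the release-specific selection described above can be sketched as a simple tag-based filter. All names, tags, and test case IDs here are hypothetical assumptions for illustration; they do not reflect the actual Daimler test tooling.

```python
# Illustrative sketch: selecting test cases for a release test.
# Tags, IDs, and structure are assumptions, not real Daimler tooling.
from dataclasses import dataclass, field

@dataclass
class TestCase:
    case_id: str
    title: str
    tags: set = field(default_factory=set)  # e.g. {"basic", "navigation"}

def select_for_release(cases, focus_tags):
    """Return the basic acceptance cases plus all cases matching the release focus."""
    return [c for c in cases if "basic" in c.tags or c.tags & focus_tags]

cases = [
    TestCase("TC-001", "Power-on self test", {"basic"}),
    TestCase("TC-102", "Online route calculation", {"navigation"}),
    TestCase("TC-205", "eCall trigger after crash signal", {"ecall"}),
]

selected = select_for_release(cases, {"navigation"})
print([c.case_id for c in selected])  # -> ['TC-001', 'TC-102']
```

In this sketch, a release focused on navigation still runs the basic acceptance set, mirroring how the defined basic test cases accompany the release-specific application cases.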

  • Cross-application customer testing: Besides the feature-based application testing using defined test cases, each application needs to be evaluated from a "customer perspective", considering cross-application use cases. This type of testing is not based on pre-defined test cases but requires a deep understanding of the related application as well as the experience and know-how of the test engineer. The testing of the feature clusters shall be documented in an appropriate way, in alignment with the feature owner.
  • HMI validation for the respective application: For each application, a clearly defined UI (User Interaction) concept exists. The related specifications shall be evaluated during the application testing (the focus stays on the functional application).
  • Documentation of defects: In case of any inconsistency with the specification or an error case, the related test case shall be documented precisely. This includes a clear description of the pre-conditions, the error path, the misbehavior, the expected behavior, the specification reference and the required log files. All defects shall be documented in the web-based defect management database DANTE.
  • Verification of fixed tickets: Once the system supplier has fixed a documented defect, its status is changed from OPEN to FIXED. Such defects shall be verified thoroughly, and the results shall be documented in DANTE (verified > closed or re-opened).
  • Maintenance and extension of test specifications: The test specification is a dynamic document which needs to be maintained frequently. Based on their experience, the test engineer shall give feedback on the test specification and support the feature owner in extending and correcting it.
  • Reporting and improvement suggestions for system weaknesses: Deviations between specification and implementation result in defects which need to be fixed by the supplier. If the system behavior is according to the specification but not customer-friendly from the perspective and experience of the test engineer, the engineer shall give feedback to the feature owner in order to initiate future system improvements.
  • Testing equipment maintenance: The testing equipment shall always be kept up to date according to the configuration of each SW release cycle. Based on specific requests from the feature owner or testers, the telematics components may be updated or downgraded for different testing purposes.
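The defect documentation fields required above (pre-conditions, error path, misbehavior, expected behavior, specification reference, log files) can be sketched as a simple record. The field names follow the description, but the structure itself is only an illustrative assumption; DANTE's real schema is not shown here, and all example values are hypothetical.

```python
# Illustrative defect record mirroring the fields required for DANTE entries.
# The record layout and example values are assumptions, not the real DANTE schema.
from dataclasses import dataclass, field

@dataclass
class DefectReport:
    preconditions: str        # system state before entering the error path
    error_path: str           # steps to reproduce the error
    misbehavior: str          # observed (wrong) behavior
    expected_behavior: str    # behavior according to the specification
    spec_reference: str       # reference to the specification section
    log_files: list = field(default_factory=list)  # attached log/trace files

defect = DefectReport(
    preconditions="Vehicle online, navigation active",
    error_path="Start route guidance, then lose network coverage",
    misbehavior="HMI freezes on the map screen",
    expected_behavior="Guidance continues with cached map data",
    spec_reference="NAV-SPEC 4.2.1",
    log_files=["headunit_trace.log"],
)
print(defect.spec_reference)
```

Capturing every field up front, including the specification reference and log files, is what makes a defect reproducible and verifiable by the supplier later on.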
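The ticket verification workflow described above amounts to a small state machine. The sketch below follows the statuses named in the text (OPEN, FIXED, closed, re-opened); the exact state names used inside DANTE are an assumption.

```python
# Minimal sketch of the defect ticket lifecycle described above.
# State names follow the job description; real DANTE states may differ.
TRANSITIONS = {
    "OPEN": {"FIXED"},                 # supplier delivers a fix
    "FIXED": {"CLOSED", "RE-OPENED"},  # verification passes or fails
    "RE-OPENED": {"FIXED"},            # supplier fixes the defect again
    "CLOSED": set(),                   # terminal state
}

def verify_fix(status, verification_passed):
    """Move a FIXED ticket to CLOSED or RE-OPENED based on the re-test result."""
    if status != "FIXED":
        raise ValueError(f"cannot verify a ticket in status {status}")
    target = "CLOSED" if verification_passed else "RE-OPENED"
    assert target in TRANSITIONS[status]
    return target

print(verify_fix("FIXED", True))   # -> CLOSED
print(verify_fix("FIXED", False))  # -> RE-OPENED
```

Modeling the allowed transitions explicitly makes it clear that a ticket can only leave FIXED through a documented verification result, matching the "verified > closed or re-open" rule.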