Howard Coleman Posted March 16, 2021

I currently run two "air" K-type thermocouples and two EGT K-type thermocouples into my G4X Extreme. All four show between 99 °F and 103 °F, yet the actual ambient temperature is 54.2 °F. The AN Volt readings square with the (erroneous) temperature readings: around 0.15–0.16 V. At 4 mV per °C, 160 mV / 4 = 40 °C = 104 °F.

I disconnected one of the Deutsch connectors between my thermocouple amp and the ECU and found the ECU was still reading 0.02 V. That equates to 5 °C, or 41 °F. Close enough... Why does the ECU read any input voltage when disconnected from the input source? And if this is expected, what is your recommendation for a fix? Thank you.

10 COLEMAN MAR 14.pclx

Attached find a log (graph) of the voltages. AN Volt 5 and 6 are connected, reading around 100 °F and 0.15–0.16 V (EGT thermocouples). AN Volt 10 and 11 are disconnected and read 0.02 V for the whole log (air thermocouples).

PC Datalog - 2021-03-16 6;54;38 pm.llgx
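The arithmetic in the post above (the standard 4 mV per °C amplifier scale, with 0 V at 0 °C) can be sketched in Python; `volts_to_f` is a hypothetical helper name, not anything from PCLink:

```python
# Convert an AN Volt reading to degF, assuming a linear K-type
# amplifier calibration of 4 mV per degC with 0 V = 0 degC.
MV_PER_DEG_C = 4.0

def volts_to_f(volts):
    deg_c = (volts * 1000.0) / MV_PER_DEG_C   # e.g. 160 mV / 4 = 40 degC
    return deg_c * 9.0 / 5.0 + 32.0           # 40 degC = 104 degF

# Readings from the log:
#   0.16 V -> 104 degF (the erroneous ~100 degF display)
#   0.02 V ->  41 degF (the floor seen with the amp disconnected)
```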
Adamw Posted March 17, 2021

0.02 V is the minimum voltage the analog inputs can detect. With nothing connected they will usually sit at 0.01 or 0.02 V; if you connect them to ground they will read zero. This only affects measurement of temperatures below about 5 °C, which isn't relevant for an amp designed to measure EGT. As for the poor accuracy of the amplifier or thermocouple, that is something you will have to discuss with the manufacturers of those items. Of course, you can fudge the ECU calibration if you want them to display correctly, provided you have equipment capable of determining the correct calibration.
Howard Coleman Posted March 17, 2021 (Author)

I am getting a steady 0.02 V (see log) when not connected, and this voltage appears to remain as an additional component of the value the ECU reads when connected. This is evidenced by temp readings of 100 °F when the ambient is 52 °F: 0.02 V is 5 °C, and 5 °C is 41 °F, which squares with the error. While 41 °F might not sound like much against an EGT reading (although it is), it is certainly significant when reading IAT, which is a very important metric. It appears to me that if my G4X Extreme is functioning like other Extremes, people should know their temp readings are high by slightly over 40 °F. There is nothing wrong with my thermocouples, my amps, or my connections. I will adjust the calibration and all will be well, but if this additional voltage exists, it would be valuable if others were apprised.
Adamw Posted March 17, 2021

A 0.02 V difference is 5 °C, which is 9 °F, not 41 °F. You don't add the 32 °F offset when you are calculating a delta temperature. And that 0.02 V offset is only present when you have nothing connected. If your amp can actually output 0 V (doubtful), then the ECU input will read 0 V; any time there is more than 0.02 V on the pin it will read correctly. You can confirm with a multimeter if you like: take note of the pin voltage the ECU is displaying, then unplug the amp and check the voltage it is outputting with a multimeter. I'm sure you will find they match.
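The distinction being made above can be sketched as follows: a temperature *difference* converts with the 9/5 factor only, while the 32 °F offset applies only to *absolute* temperatures (the function names are illustrative, assuming the same 4 mV/°C scale):

```python
MV_PER_DEG_C = 4.0

def delta_f(delta_volts):
    """Convert a voltage *offset* to a temperature *error* in degF."""
    delta_c = (delta_volts * 1000.0) / MV_PER_DEG_C
    return delta_c * 9.0 / 5.0        # no +32: deltas carry no offset

def absolute_f(volts):
    """Convert an absolute reading to degF (the +32 applies here)."""
    deg_c = (volts * 1000.0) / MV_PER_DEG_C
    return deg_c * 9.0 / 5.0 + 32.0

# A 0.02 V offset is a 9 degF error, not 41 degF:
#   delta_f(0.02)    -> 9.0
#   absolute_f(0.02) -> 41.0
```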
Howard Coleman Posted March 17, 2021 (Author)

Thanks for that.
Howard Coleman Posted April 14, 2021 (Author)

Sorry to have detoured my issue with the 0.02 V non-issue; it appears I still have a problem. I currently use four K-type thermocouples. If I input the standard calibration (0 V = 32 °F, 5 V = 2282 °F), turn the key on, and log the temp and the AN Volt channel, I get temp readings right at 100 °F and 0.17 V. The actual temp is 63 °F, and the voltage should be less than half of that 0.17 V. After discussing this with EGT Technology, with whom I have dealt for 20 years, they suggested I disconnect a couple of the amp outputs from the ECU and measure the voltage. The amp output matched the actual ambient temperature to the degree Fahrenheit. At the same time I logged the output from the two amps still connected to the ECU, which read 0.17 V, or about 100 °F.

I am currently testing very high-level intercoolers, and it is important to have solid data. By reducing the calibration for 0 V by 40 °F I can match the ambient in my shop, but I really have no idea what might happen in other parts of the scale. Might there be something I have done to create this error? Is it fixable, or can I just change the calibration? Thank you.
Adamw Posted April 15, 2021

That seems odd. Does the amp have an analog ground that is connected to the ECU sensor ground? The main issue is that you are using a low-accuracy device designed for measuring a temperature range of ~1200 deg to measure a temperature less than 1/10 of that, and expecting high accuracy. If you only need to measure up to 100 °C or so, it would be suggested to make your own two-point calibration by dropping the probes in a cup of ice water and a kettle of boiling water. If you need higher than 100 °C, then use some other fluid with a known higher boiling point. This will give you an overall calibration that compensates for the thermocouple accuracy, amp accuracy, ground offsets, and ADC accuracy.
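The two-point calibration described above can be sketched like this: record the AN Volt reading with the probe in ice water (0 °C) and in boiling water (100 °C at sea level), then fit a straight line through those two points. The voltages in the example are hypothetical readings, not values from the thread:

```python
def two_point_cal(v_ice, v_boil, t_ice_c=0.0, t_boil_c=100.0):
    """Return (slope in degC/V, offset in degC) from two known points."""
    slope = (t_boil_c - t_ice_c) / (v_boil - v_ice)
    offset = t_ice_c - slope * v_ice
    return slope, offset

def volts_to_c(volts, slope, offset):
    """Apply the fitted linear calibration to a raw voltage."""
    return slope * volts + offset

# Hypothetical example: probe reads 0.05 V in ice water, 0.45 V in
# boiling water. The fitted line then maps 0.25 V to the midpoint,
# 50 degC, and absorbs any fixed offset in the TC/amp/ADC chain.
slope, offset = two_point_cal(0.05, 0.45)
```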
Howard Coleman Posted April 15, 2021 (Author)

Did the ice-water and boiling-water thing, noted the AN Volt numbers, and re-calibrated to those. I feel I am in good shape with regard to accuracy. Thanks.