
Oil Temp sensor calibration question


Pete_89t2


Hi, I've added an oil temperature sensor to my setup, wired in to a Link G4+ Fury. The sensor is an AEM part # 30-2012, and I have AEM's calibration data for it (ohms vs. temperature, °C).

Since it appears none of the built-in calibrations from the drop-down menus match this sensor, I'm using the Cal 10 table to enter AEM's calibration data. The problem is that at the lowest temperature point (-40°C) the resistance is 402,392 ohms (~402.4 kΩ), and I can't get PCLink to accept a resistance value greater than 65K ohms in Cal Table 10. When I try, PCLink gives me a message saying the number must be less than 65K, or something to that effect. Do any of the other user-entry cal tables allow entries greater than 65K ohms, or do they all have the same limitation?

As a temporary workaround, I built my curve starting at the -10°C point, which corresponds to approximately 64K ohms on the sensor cal. The sensor readings seem to be spot on, but I assume it won't be accurate at temps below -10°C. I don't expect to ever drive this car when it's that cold, but it would be nice to be able to calibrate across the full range of the sensor if possible.


The maximum ohms figure you can input in a cal table is 65000, because the table is limited to 16-bit resolution. Even with a temp sensor that only goes up to 65000 ohms, once you put it in a circuit with a 1000 ohm pull-up resistor the resolution is so poor that 65000 is about the limit of what is useful. If you really wanted to, you could convert to volts and do the cal in volts, but 400 kΩ would use up the full 4.99V input range, so it wouldn't leave you any ability to use the error detection for a failed/disconnected sensor. So unless you really need to measure your oil temp down to -40°C, I wouldn't suggest doing that; failsafes are far more important.
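As a rough check of the divider math, a quick sketch (the 1000 ohm pull-up is from the post above; the 5.0V reference and the simple resistor-divider model are assumptions, the exact values on your ECU may differ slightly):

```python
def divider_voltage(r_sensor, r_pullup=1000.0, vref=5.0):
    """Voltage at the analog input for a sensor resistance r_sensor
    pulled up to vref through r_pullup (simple divider model)."""
    return vref * r_sensor / (r_sensor + r_pullup)

# AEM 30-2012 at -40 C: 402,392 ohms sits almost at the full input
# range, leaving essentially no headroom for open-circuit detection.
print(round(divider_voltage(402392), 3))  # ~4.988 V
```

This is why the -40°C point eats the whole voltage range: the sensor resistance is ~400x the pull-up, so the input is pulled almost all the way to the reference rail.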

In other words, that sensor has been designed with a resistance curve that gives good resolution and accuracy around normal oil operating temperatures (i.e. 80-150°C), with no consideration for its ability to measure sub-zero temperatures. If you convert to volts, it outputs 4.988V at -40°C and 4.983V at -35°C, so only a 0.005V change over 5 degrees, which the ECU's 10-bit ADCs will barely be able to measure. So the approach you have taken by starting the calibration at -10°C is the correct solution. That also gives you a little bit of voltage at the top end to incorporate error conditions.
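To see just how little a 10-bit ADC can resolve up there, a quick sketch (the two voltages are the ones quoted above; the ideal truncating ADC model and 5.0V reference are assumptions):

```python
def adc_counts(v, vref=5.0, bits=10):
    """Ideal truncating ADC: map 0..vref onto counts 0..(2^bits - 1)."""
    return min(int(v / vref * (1 << bits)), (1 << bits) - 1)

# -40 C vs -35 C on the AEM sensor: the 0.005 V difference is
# only a single ADC count, so 5 degrees is right at the limit
# of what the ECU can distinguish.
print(adc_counts(4.988))  # -40 C
print(adc_counts(4.983))  # -35 C
```

One count per five degrees means any noise at all swamps the reading, which is why starting the calibration at -10°C loses nothing in practice.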

