Dec 11, 2008


Avoid excess pressure in a cylinder by fitting a PRV (pressure-reducing valve) suited to the application.



The piston seal and the wiper seal are used to seal the oil flow within the cylinder.

O-rings are used for the proper seating of the different steps in a cylinder and protect against oil leakage.

If the cylinder is in a loaded condition, it cannot easily be removed from the application.

Design the cylinder for the required application to avoid constant maintenance.




Hydraulic systems are powerful systems that provide a high-power drive for the required job.


For example, a 5 t object can readily be lifted with 80 bar of pressure and a suitably sized cylinder.
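As a rough check of that figure, here is a minimal sketch of the sizing arithmetic; only the 5 t load and 80 bar pressure come from the post, the resulting bore is just the calculation:

    # Rough sizing check: piston area needed to lift 5 t at 80 bar.
    import math

    load_kg = 5000.0                 # 5 t load (from the post)
    g = 9.81                         # m/s^2
    force_n = load_kg * g            # ~49 kN of force required

    pressure_pa = 80e5               # 80 bar in pascals (1 bar = 1e5 Pa)
    area_m2 = force_n / pressure_pa  # required piston area
    bore_m = math.sqrt(4 * area_m2 / math.pi)

    print(f"Required area: {area_m2 * 1e4:.1f} cm^2")   # ~61 cm^2
    print(f"Minimum bore:  {bore_m * 1000:.0f} mm")     # ~88 mm

So a cylinder with roughly a 90 mm bore would do the job at that pressure.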


Nov 5, 2008

About PLC Command Instructions and Explanations



This web site contains general information about PLC Command Instructions and explains how they function in your application program.

These commands are based on Allen-Bradley's SLC-500 PLC and its RSLogix 500 software.


All reference material contained within this site is for educational purposes.
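For illustration only, here is a minimal sketch, in Python rather than ladder logic, of how two of the most common instructions behave during a program scan; the tag names and rung are made up for the example and are not taken from the referenced material:

    # Illustrative sketch of two basic PLC instructions (not RSLogix code):
    # XIC (Examine If Closed) passes rung power when its bit is 1,
    # OTE (Output Energize) writes the rung state to an output bit.
    def xic(bit: bool) -> bool:
        """Examine If Closed: true when the referenced bit is on."""
        return bit

    def ote(rung_state: bool) -> bool:
        """Output Energize: the output follows the rung condition."""
        return rung_state

    # One hypothetical rung: start push button AND NOT stop push button -> motor.
    start_pb, stop_pb = True, False
    motor = ote(xic(start_pb) and not xic(stop_pb))
    print(motor)   # True -> output energized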

Nov 4, 2008

Instrumentation and control is a good branch


Scope of instrumentation and controls

There are lots of jobs out there for instrumentation and control types, in every industry including power, refining, chemical, brewing, material handling, mining, aeronautical, and defense, among others.

All mechanical and electrical systems, including piping, hydraulic, and electrical distribution systems, require their own specific instrumentation and control systems in order to function efficiently, and every major construction company has a major need for people to design and install those systems. In addition, as newer technological advances in instrumentation occur, there is a big business in reworking and modernizing older systems.

The basics
The sensing and control of product levels in containers involves a wide range of materials: liquids, powders, slurries, and granular bulk. All level measurement involves the interaction of a sensing device, element, or system with material inside a container. You can use a wide variety of physical principles to measure level: sight, pressure, radiation, and electric and sonic principles.

Three sight-type level sensors are glass gauges, displacers, and tape floats. Glass gauges are the most widely used instruments for measuring process tank level. Two types of level glass gauges measure liquid level in process tanks: tubular and flat gauges. The tubular type works on the same principle as a manometer. As the liquid level in an open tank rises or falls, the liquid in the glass tube will rise or fall. The gauges consist of glass, plastic, or a combination of the two materials.

Tape float
One of the most simple, direct methods of float level measurement is the tape float gauge. A tape connects to a float on one end and to a counterweight on the other to keep the tape under constant tension. The float motion results in the counterweight riding up and down a direct-reading gauge board, thereby indicating the level in the tank.
Standard floats are normally cylindrical for top-mounted designs and spherical or oblong for side-mounted designs. Small-diameter floats see use in higher density materials. You can use larger floats for liquid-liquid interface detection or for lower density materials.

Pressure-type instruments
Another common example is closed-tank level measurement. If the pressure in the closed tank changes, an equal force applies to both sides of the differential pressure (dP) transmitter. Because the dP cell responds only to changes in differential pressure, a change in static pressure on the liquid surface will not change the output of the transmitter. Thus, the dP cell responds only to changes in liquid level when the specific gravity of the liquid is constant.
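The hydrostatic relationship behind this is simply h = ΔP / (ρ·g). A minimal sketch of the arithmetic; the transmitter reading and liquid density below are invented example values:

    # Infer liquid level from a differential pressure reading: h = dP / (rho * g).
    rho = 850.0          # liquid density, kg/m^3 (constant specific gravity assumed)
    g = 9.81             # m/s^2
    dp_pa = 25_000.0     # example transmitter reading: 25 kPa of hydrostatic head

    level_m = dp_pa / (rho * g)
    print(f"Liquid level above the HP tap: {level_m:.2f} m")   # ~3.0 m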
Bubblers

The air bubbler is another pressure-type level sensor where you install a dip tube in a tank with its open end a few inches from the bottom. A purge fluid is forced through the tube; when the bubbles escape from the open end, the pressure in the tube equals the hydrostatic head of the liquid. As liquid level (head) varies, the pressure in the dip tube changes correspondingly.

For tanks that operate under pressure or vacuum, installing a bubbler system becomes slightly more complex, because the liquid level measurement is a function of the difference between the purge gas pressure and the vapor pressure above the liquid. Because differential pressure is now involved, the transducer used is normally a dP cell.

One disadvantage of using a bubbler is limited accuracy. Another is bubblers will introduce foreign matter into the process. Liquid purges can also upset the material balance of the process, and gas purges can overload the vent system on vacuum processes. If the purge medium fails, not only do you lose the level indication on the tank, but you also expose the system to process material, which can cause plugging, corrosion, freezing, or safety hazards.

Capacitance probes
A variety of instruments and sensors use basic electrical principles to measure and detect level. A capacitor consists of two conductors separated by an insulator. We call the conductors plates and refer to the insulator as the dielectric. The basic nature of a capacitor is its ability to accept and store an electric charge. When a capacitor connects to a battery, electrons will flow from the negative terminal of the battery to the capacitor, and the electrons on the opposite plate of the capacitor will flow to the positive terminal of the battery. This electron flow continues until the voltage across the capacitor equals the applied voltage.
You measure capacitor size in farads. A capacitor has the capacitance of 1 farad if it stores a charge of 1 coulomb when connected to a 1-volt supply. Because this is a very large unit, we commonly use one millionth of it, noted as a microfarad. The electric size in farads of a capacitor is dependent on its physical dimensions and on the type of material (dielectric) between the capacitor plates.

Resistance tapes
The resistance tape spirally winds around a steel tape. This instrument mounts vertically from top to bottom on a process tank. The pressure of the fluid in the tank causes the tape to short-circuit, thus changing the total resistance of the measuring tape. An electronic circuit measures the resistance; it's directly related to the liquid level in the tank.

Ultrasonic level measurement
Ultrasonic level sensors measure the time required for sound waves to travel through material. Ultrasonic sound waves generally have frequencies above 20 kilohertz. Ultrasonic instruments operate at frequencies inaudible to the human ear and at extremely low power levels, normally a few thousandths of a watt. The velocity of a sound wave is a function of the type of wave transmitted and the density of the medium in which it travels.

When a sound wave moving in a medium that transmits sound strikes a solid medium, such as a wall or a liquid surface, only a small amount of the sound energy penetrates the barrier, reflecting a large percentage of the wave. The reflected sound wave is called an echo.
A generator and transmitter produce the sound waves, and a transducer sends out the sound. The measured material or level reflects the sound waves. A transducer senses the reflected waves and converts the sound wave into an electrical signal, which it amplifies and sends to a wave-shaping circuit. A timing generator synchronizes the functions in the measurement system. The instrument measures the time that elapses between the transmitter burst and the echo signal. This elapsed time is proportional to the distance between the transducers and the object being sensed. The instrument is easily calibrated to measure fluid or material level in a process vessel.
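In other words, the distance to the surface is half the round-trip time multiplied by the speed of sound. A minimal sketch of that arithmetic; the echo time, tank height, and sound velocity are example values only:

    # Level from an ultrasonic echo: distance = v * t / 2, level = tank height - distance.
    v_sound = 343.0        # speed of sound in air at ~20 C, m/s
    t_echo = 0.0116        # round-trip time of the echo, seconds (example value)
    tank_height = 4.0      # sensor face to tank bottom, metres (example value)

    distance_to_surface = v_sound * t_echo / 2.0   # ~1.99 m of empty space
    level = tank_height - distance_to_surface
    print(f"Level: {level:.2f} m")                 # ~2.01 m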

Radiation-type instruments
Nuclear radiation instruments have the ability to see through tank walls and can be mounted on the outside of process equipment. This reduces installation and repair costs. These systems can detect the level of liquids, bulk solids, and slurries.

Nuclear systems use a low-level gamma-ray source on one side of the vessel and a radiation detector on the other side. You can obtain a more accurate level measurement by placing several gamma sources at different heights. The material in the tank has a transmissibility different from that of air, so the instrument can provide an output signal proportional to the level of the material in the container.

Intrinsic safety defined

Hazardous locations are present in industries such as munitions, petrochemical, auto (paint spray booths), grain, wastewater, printing, distilling, pharmaceutical, brewing, cosmetics, mining, plastics, and utilities.

ISA-RP12.6 defines intrinsically safe equipment as "equipment and wiring which is incapable of releasing sufficient electrical or thermal energy under normal or abnormal conditions to cause ignition of a specific hazardous atmospheric mixture in its most easily ignited concentration." You can achieve this by limiting the power available to and generated by electrical equipment in the hazardous area to a level below that which will ignite the hazardous atmosphere.
The European standards define the general specifications and the detailed guidelines for methods of protection against explosion. The national requirements primarily contain installation requirements.

In the past, the U.S. and Canada have classified hazardous areas by classes, divisions, and groups. Although this system is still in use, North America is gradually beginning to adopt a classification system based on zones as standardized in many countries of the world.

Common instruments in hazardous areas

Switches: Include push buttons, selector switches, float switches, flow switches, proximity switches, and limit switches.

Thermocouples: Inexpensive temperature sensors constructed of two dissimilar metals that generate a millivolt signal varying with temperature.

I/P converters: Convert a direct current milliamp signal to a proportional pneumatic output signal, which usually positions a control valve.

Transmitters: In control systems, they convert a process variable to a proportional electrical signal. The electrical output is a 0/4-20 mA, 0/1-5 volt (V), or 0/2-10 V signal.

RTDs: Resistance temperature detectors (RTDs) convert temperature into resistance. A resistance change could be 0.385 ohms/°C for a 100-ohm platinum RTD.

Light-emitting diodes (LEDs): Don't use standard incandescent bulbs in explosive areas because of radiant heat, current requirements, and the susceptibility of the bulb to breakage.

Solenoids: Electrically actuated valves allow full flow or no flow of gases or liquids. Don't use standard 24 volt direct current solenoids in the hazardous area due to the coil's energy-storing capacity.

IS solenoids: To design for IS certification, one common procedure is to embed two diodes connected in parallel to the coil. These diodes eliminate the potential arcing if a wire were to break. They suppress the arc and provide the solenoid with a low inductance rating.

Strain gauges: Measure stress, force, weight, and pressure in load cells, scales, and transducers.

Potentiometers: Adjustable resistors whose resistance value (ohms) changes with mechanical wiper movement.

Audible alarms: Horns or buzzers signal that a hazardous event has occurred. Typically, the barrier choice would be the same for audible alarms as it is for solenoids.

Serial communications: Transferring data as a sequence of bits, generally in the form of a low-voltage (0-15 V) signal; the most common serial communications protocol is RS-232.

Fire detectors: Detect flames in a hazardous environment. In the normal state, a low current (4-6 mA) passes through the detector circuit.

A level experience
If you want to control your level problems, you need to understand which level technique to use in a particular application. While there is no perfect level control for all applications, reviewing the weaknesses of a technology and comparing to specific application parameters will yield insight into its potential for success.
A popular choice for high alarm or spill protection, on-off type devices or switches only indicate the presence or absence of product at a certain point, shutting off a pump or triggering an alarm if the fluid level in a tank gets too high. These usually go by the name of dumb switches with no self-diagnostics and no way of communicating if they are working. You must physically test them. One example is a simple mechanical float switch.
On one hand, you need to check level devices regularly if they don't have a self diagnostic. On the other hand, a contact ultrasonic or gap switch is your best choice in smart switches, which monitor themselves, sending an alarm if they wander out of specification. Proportional devices or transmitters reveal exactly how much product is in a tank. With them, information can transmit to other devices. You can use them for control or inventory.

Buoyancy

Some ancient technologies are still valid today, such as the simple principle of a float; as the fluid level rises, so does the buoyant float. The variable is merely how the motion of the float translates into a control action. Some applications find the mechanical linkages converting the float's up-and-down motion into a contact closure or opening. Look at the float in a toilet tank. In applications requiring isolation of the stored fluid, you can use magnetic coupling to seal the liquid.
These magnetically linked devices see more use in industrial applications with high pressures or hazardous fluids. The displacer method, a variation of float technology, uses heavier-than-liquid displacers where an up-and-down motion actuates a switch. Here, displacers connect in line to a spring using a suspension cable and position themselves to rise at a force proportional to the displaced volume of the liquid. Magnetic coupling to the switch is also possible, allowing the liquid to isolate from the controls.

Floats and displacers are easy to use and don't require power to operate. Floats need no calibration while you can calibrate displacers without level movement. Floats provide an accurate, repeatable set point. Displacers can have a number of on/off ranges within a single vessel if you need control of multiple levels. Because displacers are heavier than the liquid they control, they don't bob with wave or surge action. Switch short-term cycling is not a problem. Surface turbulence and foam don't impede displacers or floats. Displacer units can be continuous level transmitters or switches. Buoyancy methods are usable in applications up to 5,000 psi and 1,000 °R.

Buildup and deposits are a problem and can impede performance. Floats and displacers work only with low viscosity liquids; viscous and dry media require other methods. Liquids with the potential for buildup or those with suspended solids can cause hang-up in the sensors' moving parts.

Magnetostrictive

Magnetostrictive transmitters detect level by transmitting an electromagnetic pulse down a wire. A donut-shaped float with an internal magnet moves with the level. When the electromagnetic pulse reaches the float's magnetic field, the interaction twists the wire at the point of the level, and the resulting acoustic pulse propagates back up the wire to a pickup in the transmitter. This allows you to use the float as a transmitter rather than just a switch. The cost is reasonable, and the device has a high accuracy, but only for clean fluids. Similar devices use a chain of reed switches instead of a magnetostrictive element to save cost, but this reduces accuracy.

Magnetic level indicators
Magnetic level indicators (MLIs) consist of a float with a magnet dropped into an isolating pipe or bridle connected to the side of a tank. The bridle can be any plastic or non-magnetic metal. Fluid will rise and fall in the bridle, matching the level of fluid in the tank. Highly visible magnetic flags go outside the pipe to indicate the fluid level. These devices are safer and easier to read than a sight glass. They also can easily retransmit the signal by adding switches or transmitters, which clamp on to the outside of the pipe and pick up the magnetic field from the float. You can remove the magnetostrictive transmitter float and clamp it on to the outside of the MLI, creating a transmitter that doesn't have to be inserted into the vessel. You can also install a guided wave radar (GWR) transmitter directly into the MLI for increased integrity due to its redundant measurement.

MLIs can see use in temperatures up to 1,000 °F and are easier to read and safer than a glass sight tube. You can add alarm switches and transmitters by simply clamping them on to the outside of the bridle any time. MLIs are a piece of pipe you can build in many configurations. You don't need power for local indication. They are suitable for cleaner, low solids applications where there is little risk of the float building up or getting stuck in place.

Capacitance

Capacitance (RF or admittance) is a flexible level measurement technique that works for liquids, solids, corrosive materials, high temperatures, and pressures. However, some application sensitivities and calibration issues exist. Cumbersome calibration, dielectric shift, and buildup on the probe are key issues.
In all cases, the devices measure a change in picofarads (pF), a unit of capacitance. A simple metal rod, coated with insulation when used in electrically conductive fluids, turns the storage vessel into a large capacitor. Any material added to the tank will have a higher electrical dielectric than the air it displaces, so increasing the level of the product increases the amount of capacitance. You can make on/off or continuous measurement anywhere on the probe.
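In practice the probe is usually calibrated at two known points and the level read back by linear interpolation of the measured picofarads. A sketch of that, with invented calibration values:

    # Level from a capacitance probe by two-point calibration (values are invented).
    c_empty_pf = 110.0     # capacitance with the vessel empty, pF
    c_full_pf = 310.0      # capacitance with the vessel full, pF
    probe_len_m = 2.5      # active probe length, metres

    def level_from_capacitance(c_meas_pf: float) -> float:
        """Linear interpolation between the empty and full calibration points."""
        fraction = (c_meas_pf - c_empty_pf) / (c_full_pf - c_empty_pf)
        return max(0.0, min(1.0, fraction)) * probe_len_m

    print(f"{level_from_capacitance(210.0):.2f} m")   # half scale -> 1.25 m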

Ultrasonic
Ultrasonic level measurement techniques include sending a sound wave through air (air sonar), or liquid (liquid sonar). A sound pulse (usually ultrasonic) is transmitted, and you can time the return reflection (or echo) from the surface of the liquid to determine distance. Liquid sonar devices (gap switches or contacting ultrasonics) typically see use as a switch to detect the presence or absence of fluid in a notch in the probe. Both types use a piezo crystal to generate the pulse. Non-contact ultrasonic measurement is especially suitable for corrosive and dirty applications, as well as for liquids, slurries, and bulk solids.
Contact ultrasonics are useful in high alarm and overfill applications as they have diagnostics to self-check and ensure reliable operation and are relatively inexpensive. Ultrasonic non-contact units are limited to applications under 50 psi and 300 °F and are not reliable in the presence of heavy surface foam. Interference from falling liquids, steam, dense vapors, and dust can affect the signal propagation, as can obstructions in the vessel. Gap switches need to avoid build-up of material in the sensor gap and are typically limited to 325 °F and 1,500 psi.

Radar
Radar level measurement is based on measuring the transit time of high frequency (GHz) electromagnetic energy transmitted from an antenna at the top of the tank and reflecting off the surface of the level medium; the higher the dielectric of the medium, the stronger the reflection. Radar is robust, reliable, and becoming popular as prices decline. Line-powered and loop-powered products now offer a wide range of flexibility to the user in hazardous and non-hazardous areas.
Today, radar comes in two forms: non-contact (through air) and contact (guided wave). The transmitted energy travels freely over long distances (greater than 200 ft) and is unaffected by changes in temperature, pressure, or vapor density above the medium. Non-contact radar measures effectively in applications of varying process media conditions like dielectric or specific gravity, and you can use it in corrosive environments. High temperature (750 °F) and high pressure (5,000 psig) are possible.

Reliably picking the level signal out of the background noise (false targets) is a difficult and often unreliable process. Performance can deteriorate significantly in the presence of mixing blades.

Contact or GWR uses a probe or waveguide to conduct the signal to the surface and back. GWR can measure in almost any application less than 100 ft and will work in many situations where through-air radar and other technologies cannot. This is due to the increased transmission efficiency the metal waveguide offers.
You can install GWR and get it working with little or no calibration because the signal does not spread away from the antenna at launch; false target rejection is not an issue with GWR. It is easy to handle extremely low dielectric (dielectric constant > 1.4) media, turbulence, foam, tank obstructions (false targets), and fast-moving levels. High temperature (750 °F) and high pressure (5,000 psig) applications are common. In many applications, coating/buildup on the probe causes no significant error. GWR can measure accurately and reliably up to the very process seal of the probe. It's excellent for applications where overfill is a problem. Further, GWR has the ability to measure fluid/fluid interface applications of low dielectric over high dielectric media. Using contact and non-contact radar judiciously can be effective in most process level measurement applications.

Differential pressure

Differential pressure (DP) is a popular choice for clean liquids with a constant specific gravity. DP transmitters do not measure level directly; they instead infer level by the downward pressure or weight of the liquid against a diaphragm. If the temperature or specific gravity of the medium changes, significant error will occur. If the vessel is pressurized, you need to add a second connection to the vessel above the liquid to measure and allow for correction of this variable, hence the term differential pressure, which measures the difference between these two points. You can use DP for flow measurement by inserting a device into the line to create a pressure drop proportional to flow. DP devices connect to the vessel below the liquid surface, increasing the likelihood of leaks and making it difficult to remove if it needs service or cleaning.

Transmitters aid interface level measurement

Many processes use water as a means of transporting product from one point to another. For example, in oil production, water or steam is often used to lift oil out of a well.

In chemical production, water is sometimes a byproduct or a tool used to clean a vessel.
In these situations, the water and hydrocarbons will mix together.
At some point, it will be necessary to remove the hydrocarbon from the water.
If allowed to settle undisturbed in a tank, the mixture will separate into its two components, with the heavier, denser material sinking to the bottom and the lighter, less dense material rising to the top.
This principle is exactly the same as the way in which oil and vinegar separate in an Italian salad dressing.
One example of this in a real application is a separation tank.
A control valve regulates the ingress of a liquid mixture of water and hydrocarbon into the vessel.
Eventually, the lighter material in the mixture finds its way up to the separation stack, where a water/hydrocarbon liquid interface forms - effectively a dividing line between the two liquids.
The position of this liquid interface is critical - too high or too low and the result is either water being drawn out with the hydrocarbon, or hydrocarbon remaining in the tank.
In either situation, the end result is reduced product quality and process efficiency, adding to the product cost.
When the mixture gets to the critical interface point, a pump will pull out the hydrocarbon from the stack while a continuous amount of new mixture is pumped into the tank.
The hydrocarbon is then sent on for processing, free of water.
For this process to operate at optimum efficiency, it is vital that the interface level is measured and controlled properly.
A range of different technologies exists for interface level measurement applications.

Many of these technologies can encounter problems when either the interface level becomes too small or the process involves sticky solids.
Substances that can coat or leave residue can also present a problem when using these devices.

Here, we will look at the advantages and disadvantages associated with the three main methods most commonly employed for interface level measurement, namely: displacers, capacitance probes and differential pressure transmitters.
Displacer type transmitters rely on the principle of buoyancy and consist of a large chamber flanged to the separation stack.
A float or element of a known specific gravity will float at the point of interface.
A series of moving part linkages attached to the float indicate the float's position to a transmitter, informing it of where the interface is.
Although relatively straightforward, this technique has a number of key disadvantages.

First, petrochemical and chemical applications are often characterised by aggressive conditions, demanding the use of exotic materials, which can add substantially to the cost of the transmitter system.

The mechanical linkages can also stick, fouling the measurement and requiring frequent maintenance.

The overall accuracy of these devices is also often questionable - in some cases, customers have reported accuracies of just 10% at best.
Capacitance probes comprise a long metallic probe, which normally enters the top of the separator vessel and extends to its lowest point.
Liquid level and interface are detected by measuring the capacitance value between the wall of the vessel holding the liquid and the probe itself.

Again, the aggressive nature of most chemical and petrochemical applications will require the use of exotic materials, adding to the cost of the installation.
Another complication associated with this technology is the measurement of sticky substances, which can coat the metal, resulting in measurement uncertainties and poor readings.

Other factors such as foam on the liquid surface or vibration of the tank can also conspire to reduce measurement certainty or even render the probe inoperable.
Remote seal differential pressure transmitters probably offer the best solution for the measurement of liquid interface levels.

With this technique, when the distance between the taps on the separation stack is filled only with the lighter liquid, the differential pressure is at its minimum value, or the lower range value (LRV), of the transmitter.

When it is filled with the heavier liquid, the differential pressure is at its maximum value, or the upper range value (URV) of the transmitter.
Although this technique overcomes many of the problems associated with the previously described methods, particularly with respect to corrosion, it does have one main drawback.

The small difference in both the specific gravity of the two liquids and the distance between the taps on the separation stack results in a very small differential pressure span.

In many cases, the size of this span is often lower than the recommended minimum span for most remote seal transmitters.
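The span in question is just the density difference multiplied by the tap-to-tap distance. A quick sketch, with assumed specific gravities and tap spacing, shows how small it can be:

    # Interface-level DP span = (SG_heavy - SG_light) * rho_water * g * tap_distance.
    # Specific gravities and tap spacing below are assumed example values.
    g = 9.81
    rho_water = 1000.0     # kg/m^3
    sg_heavy = 1.00        # water
    sg_light = 0.85        # hydrocarbon
    tap_distance_m = 0.5   # distance between taps on the separation stack

    span_pa = (sg_heavy - sg_light) * rho_water * g * tap_distance_m
    print(f"DP span: {span_pa:.0f} Pa (~{span_pa / 249.09:.1f} inH2O)")  # ~736 Pa, ~3 inH2O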

One way of overcoming this problem is to use remote seals and transmitters which are sensitive enough to detect very low span changes.
An example is ABB's own remote-seal based 2600T interface level transmitter, which has been specifically designed for use at very low differential pressures.
These transmitters use a remote seal with a highly sensitive diaphragm available with a range of fill fluids for a variety of applications.
Protection against leakage of the fill fluid is ensured by an all-welded construction, which offers a significantly longer service life than seals using a conventional gasket or thread construction, particularly in vacuum applications.
A chemical plant wanted an interface level transmitter for use in a chemically aggressive hydrocarbon reprocessing application.
In this application, a mixture of process hydrocarbons cleaned from the plant's tanks and reactors, and water used for cleaning the reactors, was piped into a holding tank where it was allowed to settle.
The customer wanted to be able to pump the hydrocarbon back into the process for reclamation without also pumping any of the water.
In designing a solution, several obstacles had to be overcome.
First, the application involved a very low differential pressure span impossible for most remote seal transmitters to measure.

A second challenge was the location of the application, which was subject to considerable swings in ambient temperature.

Such inconsistent conditions can often pose a potential problem when measuring very small pressure differentials.

To solve this problem, the entire transmitter, with remote seals connected, would have to be temperature characterised together in an environmental chamber.
A microprocessor-based ABB 2600T draft range differential pressure transmitter was installed because of its small upper range limit, suitable for the close requirements of the application.

The temperature characterisation data from the environmental chamber was stored in the transmitter's memory.

The transmitter's onboard temperature sensors monitor the ambient temperature.
Accurate pressure measurement is ensured by the transmitter's microprocessor, which compares the data from the environmental chamber with the ambient temperature conditions and adjusts the transmitter's output accordingly.

A major concern at the outset was the risk of any pressure imbalance inside the capillary system due to changes in ambient temperature, which would cause the fill fluid to expand or contract.

The effect of this potential change was calculated under laboratory conditions, with the uncertainty of the system being predicted to be less than 0.5% of span.
Since this new interface level transmitter was installed, the interface level control has greatly improved.

The customer has also reported that downtime has been eliminated, saving over GBP 30,000 per year on the cost of maintenance alone.

Before this, monthly maintenance was required to clean the previously installed buoyancy transmitter system to prevent shutdowns.
Despite this, the instruments would frequently foul up anyway, resulting in the process being shut down.

Selecting the right solution for an interface level measurement application requires consideration of many factors, including accuracy, aggressiveness of the application media and the level of maintenance deemed acceptable for the application.

Opting for a remote seal differential transmitter system provides an ideal solution for aggressive applications and can help to eliminate maintenance whilst delivering greatly enhanced measurement accuracy.

Displacer transmitters in the hydrocarbon industry

The displacer transmitter for liquid level measurement is based on Archimedes' principle: the buoyancy force exerted on a body immersed in a liquid is equal to the weight of the liquid displaced.

If the cross-sectional area of the displacer and the density of the liquid are constant, then a change in level brings about a corresponding change in the apparent weight of the displacer.
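Expressed numerically, the change in apparent weight for a level change Δh is ρ·g·A·Δh. A minimal sketch with example displacer dimensions and liquid density (all assumed values):

    # Change in a displacer's apparent weight: delta_W = rho * g * A * delta_h.
    import math

    rho = 800.0                         # liquid density, kg/m^3 (example value)
    g = 9.81
    diameter_m = 0.06                   # 60 mm displacer (example value)
    area_m2 = math.pi * diameter_m**2 / 4
    delta_h_m = 0.30                    # level rises 300 mm up the displacer

    delta_weight_n = rho * g * area_m2 * delta_h_m
    print(f"Apparent weight change: {delta_weight_n:.2f} N")   # ~6.7 N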

Displacer transmitters have provided highly reliable level measurement in difficult hydrocarbon applications for many years.
The measurement technology is simple, reliable, accurate and adaptable to a wide range of needs, including the measurement of an interface between two immiscible liquids.

Importantly for hydrocarbon applications it can be used at very high temperatures and pressures, when most other technologies fail.
There are two types of displacer transmitter in common use today: torque tube and spring operated.

Both have a cylindrical displacer element of a length corresponding to the range of the level measurement required and weighted to sink in the liquid being measured.
In both, the maximum change in effective weight of the displacer element is equivalent to the weight of the liquid displaced when the displacer is completely submerged in the liquid.

It is important to also take into account the effect of the upper fluid which, even if a vapour, will have an effect on the buoyancy force, particularly if the vapour space is at a high pressure.

The difference between the two types of displacer transmitter centres on the mechanics of transmitting the displacer movement caused by the buoyancy force from the wet side of the instrument to the dry side, where it can be translated into an electronic (or, in earlier designs, pneumatic) signal proportional to the liquid level change.

With a torque tube design, the displacer element is suspended on a knife edge hanger at the end of a cantilever arm, the other end of which is welded to the torque tube.
The torque-tube is a hollow tube welded at one end to the instrument flange which is put in torsion by the weight of the displacer element on the cantilever arm.
A rod, welded to the torque tube at one end but free at its other end, sits inside the torque tube and is thus caused to rotate axially as the torque tube rotates.
When the displacer rises or falls, the corresponding angular displacement of the torque rod is linearly proportional to the displacer movement and therefore to the liquid level.

The knife-edge bearing support minimises friction and a limit stop on the torque arm is used to prevent accidental over-stressing of the torque tube.
With regular maintenance, this type of design is proven to measure reliably.
There is a huge installed base in the hydrocarbon and other industries and the technology is well understood.
It is suitable for use in very high pressures - up to about 17 MPa/170 bar (2,465 psi) - and in process temperatures from -200 °C (-328 °F) to more than 450 °C (842 °F).
However, it is a bulky instrument which can be awkward to install.
As a mechanical device with a critical knife edge bearing, it requires constant, careful maintenance to ensure continued accuracy.

And finally, because the design relies on a welded pressure joint at the flange end of the torque tube, regular inspection for signs of fatigue or corrosion is essential.

The newer spring-operated displacement transmitter is a more elegant design that overcomes many of the problems associated with torque-tube devices.
Just as reliable as the torque-tube, it is smaller, lighter and more robust.
In a spring-operated instrument, the change in apparent weight of the displacer is transmitted directly, through a spring or coil from which the displacer weight is hung.

When the displacer rises or falls with changing liquid level the spring will relax or extend accordingly as dictated by the formula: Spring extension (contraction) = Force/Spring Rate.
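Continuing the buoyancy arithmetic sketched earlier, the spring travel seen by the electronics follows directly from that formula; the spring rate here is an assumed value:

    # Spring extension = force / spring rate, the force being the buoyancy change.
    spring_rate_n_per_m = 2000.0      # assumed spring rate, N/m
    buoyancy_change_n = 6.66          # from the displacer sketch above (300 mm level rise)

    travel_m = buoyancy_change_n / spring_rate_n_per_m
    print(f"Core travel: {travel_m * 1000:.1f} mm")   # ~3.3 mm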
A core piece is located on top of a rod attached to the spring and is thus caused to rise or fall inside the pressure tube.

A precision linear variable differential transformer (LVDT) is situated outside the pressure tube, totally isolated from the process pressure and vapour.
Movement of the core within the fields of the LVDT causes an imbalance which the instrument electronics detects and is able to convert into a signal proportional to the liquid level.

It should be understood that the spring is a heavy-duty coil made from typically 3-4 mm (about 0.125 in) gauge specially selected alloy wire.
The coil is always selected such that it is operating at about 10% of its yield stress, ensuring maximum sensitivity to changes in the force on it, without the possibility of over-stressing.

Mechanical stops prevent over extension or coil bound operation.
The best instruments on the market are those with coils made from Nimonic, a nickel alloy which gives the spring a perfectly linear expansion over the full operating temperature range of the instrument giving highly accurate level measurement.
Key advantages of the spring operated transmitter are that it has a much smaller mounting envelope than a torque-tube, it is lighter and much easier to install and does not have critical welds under stress.

However, the operating range is not quite as wide; spring-operated devices are typically suitable for use in pressures up to about 25 MPa/250 bar (3,600 psi) and in process temperatures from -260 °C (-436 °F) to about 300 °C (572 °F), although specifications do vary between manufacturers.
Whichever transmitter technology is chosen for the application, the size and weight of the displacer is crucial, since it determines the relationship between the change in the apparent weight and the liquid level.
The optimum displacer diameter for any one application depends on the density of the process liquids, the process operating conditions, and the level measurement span.
It is important at the ordering stage to give the manufacturer the correct data so that the instrument can be sized correctly and be calibrated to give the correct level reading at the process operating conditions.
The inclusion of powerful microprocessor electronics and digital communications in modern displacer transmitters does however give the user the facility to trim, re-calibrate or re-range the instrument very easily on site.
As mechanical devices, displacer transmitters have traditionally needed regular maintenance, cleaning and checking of the calibration.
If you are thinking of investing in this sort of instrumentation then it is worth looking into this aspect thoroughly, since some instruments need significantly more work than others.

The best spring-operated displacement transmitters offer very stable operation with long maintenance intervals, while the maintenance investment required with some torque-tube instruments may be considerably higher.

Might TDR radar be an attractive alternative? In the last two to three years, time domain reflectometer (TDR) radar has been put forward as an alternative to mechanical displacer transmitters for level measurement in difficult applications.
TDR radar makes its measurement by sending a radar signal down a guide rod or wire and monitoring the time taken for a portion of the transmitted microwave energy to be reflected from the liquid/air interface.

The position of the liquid level surface is identified because the change in dielectric which occurs in the transmission line at that point causes reflections.
The time it takes for the reflections to get back to the receiver provides an indication of the distance between the transmitter and the surface of the liquid in the tank.

With no moving parts, this technology is attractive in clean non-viscous liquids because it requires much less maintenance than mechanical devices.
It is starting to be used in hydrocarbon applications and is said to be capable of operating in conditions up to 200 °C (392 °F) and 34 MPa/345 bar (5,000 psi), but because it has not yet established a track record, its long-term reliability in difficult level measurement applications is not yet proven.

Displacement transmitter technology is a tried and trusted method of level measurement for high temperature and high pressure environments.
The evolution from torque-tube to spring-operated instruments and the addition of high accuracy LVDTs, sophisticated electronics and digital communication options has made significant gains in functionality and operability in the field.
The introduction of TDR radar as a non-mechanical alternative suitable for all but the harshest applications offers the possibility of reliable measurement with much lower cost of ownership.

Although it is starting to make an impact on the market, TDR radar has a long way to go to catch up with the huge installed base of displacer level transmitters in demanding hydrocarbon processing applications on and off-shore.

May 4, 2008

INSTRUMENTATION

Instrumentation refers to electrical or pneumatic devices placed in the field to provide measurement and/or control capabilities for the system.

The simplest measurement instrumentation device is a thermistor. A thermistor is very similar to a typical resistor, except that it greatly varies its resistance depending on its temperature. Therefore this device can easily be used for measurement of temperature in the field. Other temperature-sensitive devices include RTDs, which also change resistance depending on temperature, and thermocouples, which produce a varying voltage when subjected to heat.
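As an illustration, converting a thermistor's resistance to temperature is commonly done with the beta-parameter equation; the component values below are typical NTC values, assumed for the example:

    # Thermistor resistance -> temperature with the beta-parameter equation.
    import math

    R0 = 10_000.0     # resistance at the reference temperature, ohms (assumed)
    T0 = 298.15       # reference temperature, kelvin (25 C)
    beta = 3950.0     # beta constant of the thermistor (assumed)

    def thermistor_temperature_c(r_ohms: float) -> float:
        """1/T = 1/T0 + (1/beta) * ln(R/R0), returned in degrees Celsius."""
        inv_t = 1.0 / T0 + math.log(r_ohms / R0) / beta
        return 1.0 / inv_t - 273.15

    print(f"{thermistor_temperature_c(10_000.0):.1f} C")   # 25.0 C at R0
    print(f"{thermistor_temperature_c(3_588.0):.1f} C")    # ~50 C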

Control instrumentation includes devices such as solenoids, Electrically Operated Valves, breakers, relays, etc. These devices are able to change a field parameter, and provide remote control capabilities.

Transmitters are devices which produce an analog signal, usually in the form of a 4-20 mA electrical current signal, although many other options are possible using voltage, frequency, or pressure. This signal can be used to directly control other instruments, or sent to a PLC, DCS, SCADA system or other type of computerized controller, where it can be interpreted into readable values, or used to control other devices and processes in the system.
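Scaling such a signal back into engineering units is a simple linear map. A sketch, with the 0-10 m level range chosen arbitrarily for the example:

    # Convert a 4-20 mA transmitter signal to engineering units (linear scaling).
    def scale_4_20ma(current_ma: float, lo: float = 0.0, hi: float = 10.0) -> float:
        """Map 4 mA -> lo and 20 mA -> hi; values outside 4-20 mA suggest a fault."""
        return lo + (current_ma - 4.0) * (hi - lo) / 16.0

    print(scale_4_20ma(4.0))    # 0.0  (bottom of range)
    print(scale_4_20ma(12.0))   # 5.0  (mid range)
    print(scale_4_20ma(20.0))   # 10.0 (top of range)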

Instrumentation plays a significant role in both gathering information from the field and changing the field parameters, and as such is a key part of control loops.





Electronic Maintenance

This comprehensive INVOLVE® multimedia training program was produced in association with the Instrument Society of America (ISA). This five-lesson program trains participants in the maintenance of electronic instruments, including pressure, temperature, flow, level, and weight transmitters as well as transducers, recorders, annunciators, and analog electronic controllers.

  • Pressure and Temperature Transmitters

  • Flow Transmitters

  • Level and Weight Transmitters

  • Transducers, Recorders, and Annunciators

  • Electronic Controllers

Pressure and Temperature Transmitters



Description:

This lesson introduces electronic transmitter maintenance focusing on pressure and temperature transmitters. The lesson describes the components of a typical pressure or temperature transmitter, their functions, adjustments, inspections, and repairs. Procedures for isolating the faulty component in a transmitter are also demonstrated.
Objectives:



  • Identify components in a disassembled electronic differential pressure or electronic temperature transmitter

  • Test the power supply for the transmitter in an electronic pressure transmitter

  • Adjust the sensor zero and replace the electronics module of the pressure transmitter

  • Isolate malfunctions to either the sensor or circuitry portion of a differential pressure transmitter

  • Verify that a sensor is properly grounded

  • Swap circuit boards in a differential pressure transmitter

  • Replace the sensor assembly of a DP transmitter

  • Identify the faulty component in a thermocouple transmitter

  • Test outputs and repair an RTD

  • Swap a defective board to calibrate a malfunctioning RTD



Flow Transmitters

Description:

This lesson introduces the inspection and repair of electronic flowmeters by demonstrating maintenance procedures for vortex shedding, turbine, magnetic, and mass electronic flowmeters. The lesson describes typical flow transmitter components, their functions, common malfunctions, and procedures for isolating a faulty component.

Objectives:

  • Test and replace the amplifier unit, sensor, and bluff body of a vortex shedding flowmeter

  • Test and replace the preamplifier unit, coil, and other necessary components of a turbine flowmeter

  • Test and replace the coil, electrodes, and circuit board in a magnetic flowmeter

  • Jumper the appropriate terminals to simulate zero output and check the flowmeter output in an installed mass flowmeter

  • Test and replace the sensor and circuit boards in an installed mass flowmeter






Level and Weight Transmitters

Description:

This lesson describes the operation, applications, and maintenance of ultrasonic, capacitance, conductivity, and radiation level detectors. The lesson also explains the functions and operation of weighing systems.



Objectives:

Describe the applications and operation of ultrasonic level detectors and their use in both point and continuous measurement applications

Troubleshoot and maintain ultrasonic level detection systems

Describe the applications and operation of radiation level detectors and their use in both point and continuous measurement applications

Explain the safety considerations when maintaining radiation level detectors

Describe the applications and operation of capacitance and conductivity level detectors in both point and continuous measurement applications

Recognize safety considerations for the use of level probes with flammable and/or explosive materials

Identify the maintenance procedures for capacitance level detection systems

Describe the applications and operation of a strain gage load cell as well as considerations for load cell calibration

Describe the applications and operation of a belt conveyor scale as well as how to test and calibrate it



Transducers, Recorders, and Annunciators

Description:

This lesson teaches routine maintenance requirements and calibration procedures for transducers, recorders, and annunciators. The lesson provides a basic understanding of the functions of I/P, P/I, and E/I transducers, multipen and multipoint recorders, and annunciators. The lesson also outlines how to identify and troubleshoot problems in these instruments.



Objectives:

Identify and describe the function of electronic transducers

Identify how I/P transducers work

Identify troubleshooting steps for pneumatic and electronic function on I/P transducers

Identify the steps for continuity tests on I/P transducers

Identify coil replacement steps for I/P transducers

Identify calibration steps for P/I and E/I transducers

Identify motor replacement steps for the chart drive on a multipen recorder

Identify the function of drive gears on a multipen recorder and how to clean them

Identify installation steps for a new drive cable on a multipoint recorder and check for proper operation

Identify the function of drive wire resistors on multipoint recorders and how to clean and inspect them

Calibrate multipen and multipoint recorders

Define the function of annunciators and troubleshoot them



Electronic Controllers

Description:

This lesson presents routine maintenance requirements and calibration procedures for electronic controllers. The lesson shows how controller circuitry works and how to adjust and calibrate each of its component sections: the display, the alarm circuitry, and the control circuitry.



Objectives:

Identify the features and functions of controllers

Describe and compare pneumatic and electronic controllers

Identify the signal path through a control circuit

Describe the function of resistors, comparators, proportional band amplifiers, integral amplifiers, differentiating amplifiers, summing amplifiers, and the transducer

Visually identify indicators on electronic controllers as well as set point, process, output, and alarms

Visually identify controls on electronic controllers as well as set point control, auto-manual selector switch, and manual/valve control

Identify appropriate equipment and demonstrate procedures for calibrating and troubleshooting display indicators

Identify appropriate test points and demonstrate procedures for calibrating and troubleshooting alarm indicators

Identify appropriate equipment for calibrating control circuits and calibrate proportional, integral, and derivative zero on the control circuit

Identify appropriate equipment for troubleshooting control circuits



 Copyright © 2008 by carthworks.com

Pressure Measurement

[Figure: the construction of a Bourdon tube gauge; the construction elements are made of brass.]

Many techniques have been developed for the measurement of pressure and vacuum. Instruments used to measure pressure are called pressure gauges or vacuum gauges. A manometer can also refer to a pressure-measuring instrument, usually one limited to measuring pressures near to atmospheric.

The term manometer is often used to refer specifically to liquid column hydrostatic instruments. A vacuum gauge is used to measure pressure in a vacuum, which is further divided into two subcategories: high and low vacuum (and sometimes ultra-high vacuum). The applicable pressure ranges of many of the techniques used to measure vacuums overlap. Hence, by combining several different types of gauge, it is possible to measure system pressure continuously from 10 mbar down to 10⁻¹¹ mbar. [1]

Zero reference


Although pressure is an absolute quantity, everyday pressure measurements, such as for tire pressure, are usually made relative to ambient air pressure. In other cases measurements are made relative to a vacuum or to some other ad hoc reference. When distinguishing between these zero references, the following terms are used:

  • Absolute pressure is zero-referenced against a perfect vacuum, so it is equal to gauge pressure plus atmospheric pressure.

  • Gauge pressure is zero-referenced against ambient air pressure, so it is equal to absolute pressure minus atmospheric pressure. Negative signs are usually omitted.

  • Differential pressure is the difference in pressure between two points.
The zero reference in use is usually implied by context, and these words are only added when clarification is needed. Tire pressure and blood pressure are gauge pressures by convention, while atmospheric pressures, deep vacuum pressures, and altimeter pressures must be absolute.

Differential pressures are commonly used in industrial process systems. Differential pressure gauges have two inlet ports, each connected to one of the volumes whose pressure is to be monitored. In effect, such a gauge performs the mathematical operation of subtraction through mechanical means, obviating the need for an operator or control system to watch two separate gauges and determine the difference in readings.

Moderate vacuum pressures are often ambiguous, as they may represent absolute pressure or gauge pressure without a negative sign. Thus a vacuum of 26 inHg gauge is equivalent to an absolute pressure of 30 inHg (typical atmospheric pressure) − 26 inHg = 4 inHg. Atmospheric pressure is typically about 100 kPa at sea level, but is variable with altitude and weather. If the absolute pressure of a fluid stays constant, the gauge pressure of the same fluid will vary as atmospheric pressure changes.

For example, when a car drives up a mountain, the gauge tire pressure goes up because the atmospheric pressure around the tire goes down. Some standard values of atmospheric pressure such as 101.325 kPa or 100 kPa have been defined, and some instruments use one of these standard values as a constant zero reference instead of the actual variable ambient air pressure. This impairs the accuracy of these instruments, especially when used at high altitudes.
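As a rough numerical illustration of these relationships, here is a minimal Python sketch; the 220 kPa tire reading and 80 kPa mountain-top ambient pressure are made-up example values.

# Minimal sketch of the zero-reference relationships described above.
# Values are illustrative; real instruments report whichever reference they use.

STANDARD_ATMOSPHERE_KPA = 101.325  # one common fixed zero reference

def absolute_from_gauge(gauge_kpa, ambient_kpa=STANDARD_ATMOSPHERE_KPA):
    """Absolute pressure = gauge pressure + ambient (atmospheric) pressure."""
    return gauge_kpa + ambient_kpa

def gauge_from_absolute(absolute_kpa, ambient_kpa=STANDARD_ATMOSPHERE_KPA):
    """Gauge pressure = absolute pressure - ambient (atmospheric) pressure."""
    return absolute_kpa - ambient_kpa

# A tire inflated to 220 kPa (gauge) at sea level:
tire_abs = absolute_from_gauge(220.0)            # ~321.3 kPa absolute
# Drive up a mountain where ambient drops to 80 kPa; the absolute pressure in
# the tire is unchanged, but the gauge reading rises:
tire_gauge_at_altitude = gauge_from_absolute(tire_abs, ambient_kpa=80.0)  # ~241.3 kPa
print(tire_abs, tire_gauge_at_altitude)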

Units


Pressure units:

         pascal (Pa)   bar             technical atm (at)   atmosphere (atm)   torr (Torr)         psi (lbf/in²)
1 Pa   = 1 N/m²        10⁻⁵            1.0197×10⁻⁵          9.8692×10⁻⁶        7.5006×10⁻³         145.04×10⁻⁶
1 bar  = 100,000       ≡ 10⁶ dyn/cm²   1.0197               0.98692            750.06              14.5037744
1 at   = 98,066.5      0.980665        ≡ 1 kgf/cm²          0.96784            735.56              14.223
1 atm  = 101,325       1.01325         1.0332               ≡ 1 atm            760                 14.696
1 torr = 133.322       1.3332×10⁻³     1.3595×10⁻³          1.3158×10⁻³        ≡ 1 Torr ≈ 1 mmHg   19.337×10⁻³
1 psi  = 6,894.76      68.948×10⁻³     70.307×10⁻³          68.046×10⁻³        51.715              ≡ 1 lbf/in²

Example reading: 1 Pa = 1 N/m² = 10⁻⁵ bar = 10.197×10⁻⁶ at = 9.8692×10⁻⁶ atm, etc. Note: mmHg is an abbreviation for millimetres of mercury.

The SI unit for pressure is the pascal (Pa), equal to one newton per square metre (N·m⁻² or kg·m⁻¹·s⁻²). This special name for the unit was added in 1971; before that, pressure in SI was expressed in units such as N/m². When indicated, the zero reference is stated in parentheses following the unit, for example 101 kPa (abs).
Pounds per square inch (psi) is still in widespread use in the US and Canada, notably for cars. A letter is often appended to the psi unit to indicate the measurement's zero reference: psia for absolute, psig for gauge, and psid for differential, although this practice is discouraged by NIST [1]. Because pressure was once commonly measured by its ability to displace a column of liquid in a manometer, pressures are often expressed as a depth of a particular fluid (e.g. inches of water). The most common choices are mercury (Hg) and water; water is nontoxic and readily available, while mercury's density allows for a shorter column (and so a smaller manometer) to measure a given pressure. Fluid density and local gravity can vary from one reading to another depending on local factors, so the height of a fluid column does not define pressure precisely. When 'millimetres of mercury' or 'inches of mercury' are quoted today, these units are not based on a physical column of mercury; rather, they have been given precise definitions that can be expressed in terms of SI units. The water-based units usually assume one of the older definitions of the kilogram as the weight of a litre of water. Although no longer favoured by measurement experts, these manometric units are still encountered in many fields.

Blood pressure is measured in millimetres of mercury in most of the world, and lung pressures in centimetres of water are still common. Natural gas pipeline pressures are measured in inches of water, expressed as "WC" ("water column"). Scuba divers often use a manometric rule of thumb: the pressure exerted by ten metres depth of water is approximately equal to one atmosphere. In vacuum systems, the units torr, micrometre of mercury (micron), and inch of mercury (inHg) are most commonly used. Torr and micron usually indicate an absolute pressure, while inHg usually indicates a gauge pressure. Atmospheric pressures are usually stated using kilopascals (kPa) or atmospheres (atm), except in American meteorology where the hectopascal (hPa) and millibar (mbar) are preferred.

In American and Canadian engineering, stress is often measured in kip; note that stress is not a true pressure since it is not scalar. In the cgs system the unit of pressure was the barye (ba), equal to 1 dyn·cm⁻². In the mts system, the unit of pressure was the pieze, equal to 1 sthene per square metre. Many other hybrid units are used, such as mmHg/cm² or grams-force/cm² (sometimes written as kg/cm² without properly identifying the force units). Using the names kilogram, gram, kilogram-force, or gram-force (or their symbols) as a unit of force is forbidden in SI; the unit of force in SI is the newton.
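The conversion table above lends itself to a small helper. The Python sketch below uses the commonly quoted factors from the table and converts between the listed units by going through pascals; it is a convenience sketch, not a substitute for a proper units library.

# Rough unit-conversion helper based on the table above (factors in pascals per unit).

PA_PER_UNIT = {
    "Pa":   1.0,
    "bar":  100_000.0,
    "at":   98_066.5,      # technical atmosphere (1 kgf/cm²)
    "atm":  101_325.0,     # standard atmosphere
    "torr": 133.322,       # approximately 1 mmHg
    "psi":  6_894.76,      # pound-force per square inch
}

def convert_pressure(value, from_unit, to_unit):
    """Convert a pressure value between the units listed in the table."""
    return value * PA_PER_UNIT[from_unit] / PA_PER_UNIT[to_unit]

print(convert_pressure(1.0, "atm", "torr"))   # ~760
print(convert_pressure(1.0, "bar", "psi"))    # ~14.5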

Dynamic pressure
Static pressure is uniform in all directions, so pressure measurements are independent of direction in an immobile (static) fluid. Flow, however, applies additional pressure on surfaces perpendicular to the flow direction, while having little impact on surfaces parallel to the flow direction. This directional component of pressure in a moving (dynamic) fluid is called dynamic pressure. An instrument facing the flow direction measures the sum of the static and dynamic pressures; this measurement is called the total pressure or stagnation pressure.
Since dynamic pressure is referenced to static pressure, it is neither gauge nor absolute; it is a differential pressure.

While static gauge pressure is of primary importance for determining net loads on pipe walls, dynamic pressure is used to measure flow rates and airspeed. Dynamic pressure can be measured by taking the differential pressure between instruments parallel and perpendicular to the flow. Pitot-static tubes, for example, perform this measurement on airplanes to determine airspeed. The presence of the measuring instrument inevitably acts to divert flow and create turbulence, so its shape is critical to accuracy and the calibration curves are often non-linear.
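A minimal sketch of that calculation follows, assuming incompressible flow and a sea-level air density of 1.225 kg/m³, so it is only a low-speed approximation.

import math

# Pitot-static relationship: dynamic pressure is total minus static pressure,
# and q = 0.5 * rho * v**2 is inverted to recover airspeed.

def dynamic_pressure(total_pa, static_pa):
    """Dynamic pressure is the difference between total and static pressure."""
    return total_pa - static_pa

def airspeed_from_dynamic_pressure(q_pa, air_density=1.225):
    """Invert q = 0.5 * rho * v**2 for speed (air_density in kg/m^3)."""
    return math.sqrt(2.0 * q_pa / air_density)

q = dynamic_pressure(total_pa=102_000.0, static_pa=101_325.0)   # 675 Pa
print(airspeed_from_dynamic_pressure(q))                        # ~33 m/s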

Applications
* Sphygmomanometer
* Barometer
* Altimeter
* Pitot tube
* MAP sensor

Instruments
Many instruments have been invented to measure pressure, with different advantages and disadvantages. Pressure range, sensitivity, dynamic response and cost all vary by several orders of magnitude from one instrument design to the next. The oldest type is the liquid column manometer (a vertical tube filled with mercury), invented by Evangelista Torricelli in 1643. The U-tube was invented by Christiaan Huygens in 1661.

Hydrostatic
Hydrostatic gauges (such as the mercury column manometer) compare pressure to the hydrostatic force per unit area at the base of a column of fluid. Hydrostatic gauge measurements are independent of the type of gas being measured and can be designed to have a very linear calibration. They have poor dynamic response.

Piston
Piston-type gauges counterbalance the pressure of a fluid with a solid weight or a spring. Examples are dead-weight testers used for calibration and tire-pressure gauges.

Liquid column
The difference in fluid height in a liquid column manometer is proportional to the pressure difference.
Liquid column gauges consist of a vertical column of liquid in a tube whose ends are exposed to different pressures. The column will rise or fall until its weight is in equilibrium with the pressure differential between the two ends of the tube. A very simple version is a U-shaped tube half-full of liquid, one side of which is connected to the region of interest while the reference pressure (which might be the atmospheric pressure or a vacuum) is applied to the other. The difference in liquid level represents the applied pressure. The pressure exerted by a column of fluid of height h and density ρ is given by the hydrostatic pressure equation, P = hgρ. Therefore the pressure difference between the applied pressure Pa and the reference pressure Po in a U-tube manometer can be found by solving Pa − Po = hgρ, i.e. the column height is h = (Pa − Po)/(gρ).

If the fluid being measured is significantly dense, hydrostatic corrections may have to be made for the height between the moving surface of the manometer working fluid and the location where the pressure measurement is desired. Any fluid can be used, but mercury is preferred for its high density (13.534 g/cm³) and low vapour pressure. For low pressure differences well above the vapour pressure of water, water is a commonly used liquid (and "inches of water" is a commonly used pressure unit). Liquid column pressure gauges are independent of the type of gas being measured and have a highly linear calibration. They have poor dynamic response. When measuring vacuum, the working liquid may evaporate and contaminate the vacuum if its vapour pressure is too high. When measuring liquid pressure, a loop filled with gas or a light fluid must isolate the liquids to prevent them from mixing. Simple hydrostatic gauges can measure pressures ranging from a few torr (a few hundred Pa) to a few atmospheres (approximately 1,000,000 Pa).

A single-limb liquid-column manometer has a larger reservoir instead of one side of the U-tube and has a scale beside the narrower column. The column may be inclined to further amplify the liquid movement. Based on their use and structure, the following types of manometers are used [2]:
1. Simple manometer
2. Micromanometer
3. Differential manometer
4. Inverted differential manometer
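A short worked sketch of the column-height relation, using standard gravity and illustrative densities for mercury and water, shows why the denser fluid allows a much shorter column.

# Sketch of the U-tube relation Pa - Po = h * g * rho given above.
# Densities are illustrative: mercury ~13,534 kg/m^3, water ~1,000 kg/m^3.

G = 9.80665  # standard gravity, m/s^2

def column_height(delta_p_pa, density_kg_m3):
    """Height difference (m) produced by a pressure difference across the tube."""
    return delta_p_pa / (G * density_kg_m3)

def pressure_difference(height_m, density_kg_m3):
    """Inverse relation: Pa - Po = h * g * rho."""
    return height_m * G * density_kg_m3

# The same 5 kPa difference read with mercury and with water:
print(column_height(5_000.0, 13_534.0))  # ~0.038 m of mercury
print(column_height(5_000.0, 1_000.0))   # ~0.51 m of water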

McLeod gauge
A McLeod gauge isolates a sample of gas and compresses it in a modified mercury manometer until the pressure is a few mmHg. The gas must be well-behaved during its compression (it must not condense, for example). The technique is slow and unsuited to continual monitoring, but is capable of good accuracy. Useful range: above 10⁻⁴ torr [3] (roughly 10⁻² Pa). The McLeod gauge isolates a known volume of vacuum and compresses it to multiply the height variation of the liquid column. It can measure vacuums as high as 10⁻⁶ Torr (0.1 mPa), which is the lowest direct measurement of pressure that is possible with current technology. Other vacuum gauges can measure lower pressures, but only indirectly by measurement of other pressure-controlled properties. These indirect measurements must be calibrated to SI units via a direct measurement, most commonly a McLeod gauge. [4]
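The compression step is essentially Boyle's law, so a back-of-the-envelope sketch looks like the following; the volumes and column reading are hypothetical, not those of any particular instrument.

# Minimal sketch of the Boyle's-law idea behind a McLeod gauge: a known volume
# of gas at the unknown pressure is compressed to a much smaller volume, where
# the (now much larger) pressure can be read directly on a mercury column.

def mcleod_pressure(initial_volume_cm3, compressed_volume_cm3, compressed_pressure_torr):
    """Unknown pressure from Boyle's law: P1*V1 = P2*V2 (isothermal, ideal gas)."""
    return compressed_pressure_torr * compressed_volume_cm3 / initial_volume_cm3

# 100 cm^3 of gas squeezed into 0.01 cm^3 reads 2 torr on the column:
print(mcleod_pressure(100.0, 0.01, 2.0))   # 2e-4 torr in the original system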

Aneroid
Aneroid gauges are based on a metallic pressure-sensing element which flexes elastically under the effect of a pressure difference across the element. "Aneroid" means "without fluid," and the term originally distinguished these gauges from the hydrostatic gauges described above. However, aneroid gauges can be used to measure the pressure of a liquid as well as a gas, and they are not the only type of gauge that can operate without fluid. For this reason, they are often called mechanical gauges in modern language. Aneroid gauges are not dependent on the type of gas being measured, unlike thermal and ionization gauges, and are less likely to contaminate the system than hydrostatic gauges. The pressure-sensing element may be a Bourdon tube, a diaphragm, a capsule, or a set of bellows, which will change shape in response to the pressure of the region in question. The deflection of the pressure-sensing element may be read by a linkage connected to a needle, or it may be read by a secondary transducer. The most common secondary transducers in modern vacuum gauges measure a change in capacitance due to the mechanical deflection. Gauges that rely on a change in capacitance are often referred to as Baratron gauges.

Bourdon
A Bourdon gauge uses a coiled tube which, as it expands due to a pressure increase, causes a rotation of an arm connected to the tube. In 1849 the Bourdon tube pressure gauge was patented in France by Eugene Bourdon. The pressure-sensing element is a closed coiled tube connected to the chamber or pipe in which pressure is to be sensed. As the gauge pressure increases the tube will tend to uncoil, while a reduced gauge pressure will cause the tube to coil more tightly. This motion is transferred through a linkage to a gear train connected to an indicating needle. The needle is presented in front of a card face inscribed with the pressure indications associated with particular needle deflections.
In a barometer, the Bourdon tube is sealed at both ends and the absolute pressure of the ambient atmosphere is sensed. Differential Bourdon gauges use two Bourdon tubes and a mechanical linkage that compares the readings. In the original illustrations (not reproduced here) the transparent cover face has been removed and the mechanism removed from the case. That particular gauge is a combination vacuum and pressure gauge used for automotive diagnosis:
* the left side of the face, used for measuring manifold vacuum, is calibrated in centimetres of mercury on its inner scale and inches of mercury on its outer scale.
* the right portion of the face is used to measure fuel pump pressure and is calibrated in fractions of 1 kgf/cm² on its inner scale and pounds per square inch on its outer scale.

Mechanical details
Stationary parts:

* A: Receiver block. This joins the inlet pipe to the fixed end of the Bourdon tube (1) and secures the chassis plate (B). The two holes receive screws that secure the case.
* B: Chassis plate. The face card is attached to this. It contains bearing holes for the axles.
* C: Secondary chassis plate. It supports the outer ends of the axles.
* D: Posts to join and space the two chassis plates.

Moving parts:
1. Stationary end of Bourdon tube. This communicates with the inlet pipe through the receiver block.
2. Moving end of Bourdon tube. This end is sealed.
3. Pivot and pivot pin.
4. Link joining the pivot pin to the lever (5), with pins to allow joint rotation.
5. Lever. This is an extension of the sector gear (7).
6. Sector gear axle pin.
7. Sector gear.
8. Indicator needle axle. This has a spur gear that engages the sector gear (7) and extends through the face to drive the indicator needle. Due to the short distance between the lever arm link boss and the pivot pin, and the difference between the effective radius of the sector gear and that of the spur gear, any motion of the Bourdon tube is greatly amplified; a small motion of the tube results in a large motion of the indicator needle (a rough numerical illustration follows this list).
9. Hair spring to preload the gear train to eliminate gear lash and hysteresis.
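As a rough illustration of that amplification, here is a toy first-order model; the dimensions are hypothetical and a small-angle approximation is used, so it is only a sketch of the idea, not the geometry of a real gauge.

import math

# Toy model of the mechanical amplification described above: the Bourdon tube
# tip motion rotates the lever/sector gear, and the sector-to-spur gear ratio
# multiplies that rotation at the needle.

def needle_rotation_deg(tip_travel_mm, lever_length_mm, sector_radius_mm, spur_radius_mm):
    """First-order estimate of needle rotation for a given tube tip travel."""
    sector_angle = tip_travel_mm / lever_length_mm          # radians, small-angle
    needle_angle = sector_angle * sector_radius_mm / spur_radius_mm
    return math.degrees(needle_angle)

# 2 mm of tip travel, 12 mm lever, 30 mm sector radius, 2.5 mm spur radius:
print(needle_rotation_deg(2.0, 12.0, 30.0, 2.5))   # ~115 degrees of needle swing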

Diaphragm
A second type of aneroid gauge uses the deflection of a flexible membrane that separates regions of different pressure. The amount of deflection is repeatable for known pressures, so the pressure can be determined by using calibration. The deformation of a thin diaphragm depends on the difference in pressure between its two faces. The reference face can be open to atmosphere to measure gauge pressure, open to a second port to measure differential pressure, or sealed against a vacuum or other fixed reference pressure to measure absolute pressure. The deformation can be measured using mechanical, optical or capacitive techniques. Ceramic and metallic diaphragms are used. Useful range: above 10⁻² Torr [5] (roughly 1 Pa). For absolute measurements, welded pressure capsules with diaphragms on either side are often used (for example, the stack of corrugated pressure capsules in an aneroid barograph).

Shape:
* flat
* corrugated
* flattened tube
* capsule

Bellows
In gauges intended to sense small pressures or pressure differences, or where an absolute pressure must be measured, the gear train and needle may be driven by an enclosed and sealed bellows chamber, called an aneroid, which means "without liquid". (Early barometers used a column of liquid such as water or the liquid metal mercury suspended by a vacuum.) This bellows configuration is used in aneroid barometers (barometers with an indicating needle and dial card), altimeters, altitude-recording barographs, and the altitude telemetry instruments used in weather balloon radiosondes.


These devices use the sealed chamber as a reference pressure and are driven by the external pressure. Other sensitive aircraft instruments such as air speed indicators and rate of climb indicators (variometers) have connections both to the internal part of the aneroid chamber and to an external enclosing chamber.

Secondary transducer
* resistive (strain gauge)
* inductive
* capacitive - The deflection of the diaphragm or piston often forms one half of a capacitor, so that when it moves, the capacitance of the device changes. This is a common way (with proper calibration) to get a very precise, electronic reading from a manometer, and this configuration is called a capacitive manometer vacuum gauge. It is also called a capacitance manometer, in which the diaphragm makes up part of a capacitor; a change in pressure leads to flexure of the diaphragm, which results in a change in capacitance. These gauges are effective from 10⁻³ Torr to 10⁻⁴ Torr. (A toy parallel-plate illustration follows this list.)
* piezoelectric/piezoresistive
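A simple way to see why the capacitive approach works is to model the diaphragm and a fixed electrode as a parallel-plate capacitor. The sketch below uses the idealised parallel-plate formula and illustrative dimensions, not the behaviour of any specific sensor.

# Parallel-plate model of a capacitance manometer: C = eps0 * A / d, so a
# pressure-driven change in the gap d shows up as a measurable change in C.

EPS0 = 8.854e-12  # permittivity of free space, F/m

def plate_capacitance(area_m2, gap_m):
    return EPS0 * area_m2 / gap_m

area = 1.0e-4                                 # 1 cm^2 electrode (illustrative)
c_rest = plate_capacitance(area, 100e-6)      # 100 um gap, no deflection
c_deflected = plate_capacitance(area, 99e-6)  # diaphragm moves 1 um toward electrode
print(c_rest, c_deflected - c_rest)           # ~8.85 pF at rest, ~0.09 pF shift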

Thermal conductivity
Thermal conductivity gauges rely on the fact that the ability of a gas to conduct heat decreases with pressure. In this type of gauge, a wire filament is heated by running current through it. A thermocouple or resistance temperature detector (RTD) can then be used to measure the temperature of the filament. This temperature depends on the rate at which the filament loses heat to the surrounding gas, and therefore on the thermal conductivity. A common variant is the Pirani gauge, which uses a single platinum filament as both the heated element and the RTD. These gauges are accurate from 10 Torr to 10⁻³ Torr, but they are sensitive to the chemical composition of the gases being measured.

Two wire
One wire coil is used as a heater, and the other is used to measure the nearby temperature due to convection.

Pirani (one wire)
A Pirani gauge consists of a metal wire open to the pressure being measured. The wire is heated by a current flowing through it and cooled by the gas surrounding it. If the gas pressure is reduced, the cooling effect will decrease, hence the equilibrium temperature of the wire will increase. The resistance of the wire is a function of its temperature: by measuring the voltage across the wire and the current flowing through it, the resistance (and so the gas pressure) can be determined.

This type of gauge was invented by Marcello Pirani. Thermocouple gauges and thermistor gauges work in a similar manner, except a thermocouple or thermistor is used to measure the temperature of the wire. Useful range: 10⁻³ to 10 Torr [6] (roughly 10⁻¹ to 1000 Pa).
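The measurement chain just described can be sketched in a few lines: the voltage and current give the filament resistance, and a platinum RTD relation gives its temperature. The final temperature-to-pressure step is omitted because it depends on the particular gauge and gas calibration; R0 and the example readings below are hypothetical.

# Sketch of the V/I -> resistance -> temperature steps for a Pirani-style gauge.

ALPHA_PT = 3.85e-3   # 1/degC, common platinum temperature coefficient
R0 = 100.0           # ohms at 0 degC (hypothetical filament)

def filament_resistance(volts, amps):
    return volts / amps

def filament_temperature_c(resistance_ohm):
    """Invert R = R0 * (1 + alpha * T) for temperature in degC."""
    return (resistance_ohm / R0 - 1.0) / ALPHA_PT

r = filament_resistance(volts=2.4, amps=0.015)   # 160 ohms
print(filament_temperature_c(r))                  # ~156 degC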

Ionization gauge
Ionization gauges are the most sensitive gauges for very low pressures (high vacuums, also known as "hard" vacuums). They sense pressure indirectly by measuring the electrical ions produced when the gas is bombarded with electrons. Fewer ions will be produced by lower-density gases. The calibration of an ion gauge is unstable and dependent on the nature of the gases being measured, which is not always known. They can be calibrated against a McLeod gauge, which is much more stable and independent of gas chemistry. Thermionic emission generates electrons, which collide with gas atoms and generate positive ions. The ions are attracted to a suitably biased electrode known as the collector. The current in the collector is proportional to the rate of ionization, which is a function of the pressure in the system. Hence, measuring the collector current gives the gas pressure.
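A commonly quoted working relation for such gauges is I_collector = S · P · I_emission, where S is a gas- and geometry-dependent sensitivity. The sketch below simply inverts that relation; the sensitivity of 10 per torr is a typical nitrogen figure, not a calibration of any particular gauge.

# Invert the ion-gauge relation I_c = S * P * I_e for pressure.

def ion_gauge_pressure_torr(collector_amps, emission_amps, sensitivity_per_torr=10.0):
    """Pressure estimate from collector and emission currents."""
    return collector_amps / (sensitivity_per_torr * emission_amps)

# 40 pA of collector current at 4 mA emission:
print(ion_gauge_pressure_torr(40e-12, 4e-3))   # ~1e-9 Torr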

There are several sub-types of ionization gauge. Useful range: 10⁻¹⁰ to 10⁻³ torr (roughly 10⁻⁸ to 10⁻¹ Pa). Most ion gauges come in two types: hot cathode and cold cathode. A third type, the spinning rotor gauge, is more sensitive and expensive but is not discussed here. In the hot cathode version an electrically heated filament produces an electron beam.

The electrons travel through the gauge and ionize gas molecules around them. The resulting ions are collected at a negative electrode. The current depends on the number of ions, which depends on the pressure in the gauge. Hot cathode gauges are accurate from 10⁻³ Torr to 10⁻¹⁰ Torr. The principle behind the cold cathode version is the same, except that electrons are produced in a discharge created by a high-voltage electrical discharge. Cold cathode gauges are accurate from 10⁻² Torr to 10⁻⁹ Torr. Ionization gauge calibration is very sensitive to construction geometry, the chemical composition of the gases being measured, corrosion, and surface deposits. Their calibration can be invalidated by activation at atmospheric pressure or low vacuum. The composition of gases at high vacuums will usually be unpredictable, so a mass spectrometer must be used in conjunction with the ionization gauge for accurate measurement. [7]

Hot cathode
A hot cathode ionization gauge, such as the Bayard-Alpert gauge, is mainly composed of three electrodes acting together as a triode, where the cathode is the filament. The three electrodes are a collector or plate, a filament, and a grid. The collector current is measured in picoamps by an electrometer. The filament voltage to ground is usually at a potential of 30 volts, while the grid voltage is at 180–210 volts DC, unless there is an optional electron-bombardment feature (heating the grid), in which case the grid may have a high potential of approximately 565 volts.

The most common ion gauge is the hot cathode Bayard-Alpert gauge, with a small ion collector inside the grid. A glass envelope with an opening to the vacuum can surround the electrodes, but usually the nude gauge is inserted in the vacuum chamber directly, the pins being fed through a ceramic plate in the wall of the chamber. Hot cathode gauges can be damaged or lose their calibration if they are exposed to atmospheric pressure or even low vacuum while hot. The measurements of a hot cathode ionization gauge are always logarithmic. Electrons emitted from the filament move back and forth around the grid several times before finally entering it. During these movements, some electrons collide with a gaseous molecule to form an ion-electron pair (electron ionization).

The number of these ions is proportional to the gaseous molecule density multiplied by the electron current emitted from the filament, and these ions pour into the collector to form an ion current. Since the gaseous molecule density is proportional to the pressure, the pressure is estimated by measuring the ion current. The low-pressure sensitivity of hot cathode gauges is limited by the photoelectric effect: electrons hitting the grid produce x-rays that produce photoelectric noise in the ion collector. This limits the range of older hot cathode gauges to 10⁻⁸ Torr and the Bayard-Alpert to about 10⁻¹⁰ Torr.

Additional wires at cathode potential in the line of sight between the ion collector and the grid prevent this effect. In the extraction type the ions are not attracted by a wire, but by an open cone. As the ions cannot decide which part of the cone to hit, they pass through the hole and form an ion beam. This ion beam can be passed on to a:
* Faraday cup
* microchannel plate detector with Faraday cup
* quadrupole mass analyzer with Faraday cup
* quadrupole mass analyzer with microchannel plate detector and Faraday cup
* ion lens and acceleration voltage, directed at a target to form a sputter gun (in this case a valve lets gas into the grid cage)
See also: Electron ionization

Cold cathode
There are two subtypes of cold cathode ionization gauges: the Penning gauge (invented by Frans Michel Penning), and the inverted magnetron, also called a Redhead gauge. The major difference between the two is the position of the anode with respect to the cathode. Neither has a filament, and each may require a DC potential of about 4 kV for operation. Inverted magnetrons can measure down to 1×10⁻¹² Torr. Such gauges cannot operate if the ions generated by the cathode recombine before reaching the anodes. If the mean free path of the gas within the gauge is smaller than the gauge's dimensions, then the electrode current will essentially vanish. A practical upper bound to the detectable pressure is, for a Penning gauge, of the order of 10⁻³ Torr.
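The mean-free-path argument can be made concrete with the standard kinetic-theory estimate λ = kT/(√2·π·d²·P); the molecular diameter and temperature below are illustrative (roughly nitrogen at room temperature).

import math

# Mean free path estimate: lambda = k*T / (sqrt(2) * pi * d^2 * P).

K_BOLTZMANN = 1.380649e-23   # J/K
TORR_TO_PA = 133.322

def mean_free_path_m(pressure_torr, temp_k=293.0, molecule_diameter_m=3.7e-10):
    p_pa = pressure_torr * TORR_TO_PA
    return K_BOLTZMANN * temp_k / (math.sqrt(2.0) * math.pi * molecule_diameter_m**2 * p_pa)

# Near the quoted Penning upper bound of ~1e-3 Torr the mean free path is only
# a few centimetres, comparable to the dimensions of the gauge itself:
print(mean_free_path_m(1e-3))   # ~0.05 m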

Similarly, cold cathode gauges may be reluctant to start at very low pressures, in that the near-absence of a gas makes it difficult to establish an electrode current, particularly in Penning gauges, which use an axially symmetric magnetic field to create path lengths for ions on the order of metres. In ambient air, suitable ion pairs are ubiquitously formed by cosmic radiation; in a Penning gauge, design features are used to ease the set-up of a discharge path. For example, the electrode of a Penning gauge is usually finely tapered to facilitate the field emission of electrons.

Maintenance cycles of cold cathode gauges are generally measured in years, depending on the gas type and pressure at which they are operated. Using a cold cathode gauge in gases with substantial organic components, such as pump oil fractions, can result in the growth of delicate carbon films and shards within the gauge which eventually either short-circuit the electrodes of the gauge or impede the generation of a discharge path.
