Saturday, 14 September 2013

CRITICALITY

CRITICALITY is a measure of the frequency of occurrence of an effect.
                       – May be based on qualitative judgement or
                       – May be based on failure rate data (most common)
Qualitative analysis:
        –Used when specific part or item failure rates are not available.
Quantitative analysis:
        –Used when sufficient failure rate data is available to calculate criticality numbers.
Qualitative Approach:
  • Because failure rate data is not available, failure mode ratios and failure mode probability are not used.
  • The probability of occurrence of each failure is grouped into discrete levels that establish the qualitative failure probability level for each entry based on the judgment of the analyst.
  • The failure mode probability levels of occurrence are:
          – Level A – Frequent
          – Level B – Reasonably Probable
          – Level C – Occasional
          – Level D – Remote
          – Level E – Extremely Unlikely
Quantitative Approach
Failure Mode Criticality (Cm) is the portion of the criticality number for an item, due to one of its failure modes, which results in a particular severity classification (e.g. results in an end effect with severity I, II, etc.).
  • Category I – Catastrophic: A failure which may cause death or weapon system loss (i.e., aircraft, tank, missile, ship, etc.).
  • Category II – Critical: A failure which may cause severe injury, major property damage, or major system damage which will result in mission loss.
  • Category III – Marginal: A failure which may cause minor injury, minor property damage, or minor system damage which will result in delay, loss of availability, or mission degradation.
  • Category IV – Minor: A failure not serious enough to cause injury, property damage, or system damage, but which will result in unscheduled maintenance or repair.
The quantitative approach uses the following formula for Failure Mode Criticality:
Cm = β α λp t
Where
            Cm = Failure Mode Criticality
            β = Conditional probability of occurrence of the next higher failure effect
            α = Failure mode ratio
            λp = Part failure rate
            t = Duration of applicable mission phase
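The formula can be sketched in a few lines of Python. The numbers below are purely illustrative and not drawn from any real failure-rate database:

```python
def failure_mode_criticality(beta, alpha, lambda_p, t):
    """Compute Cm = beta * alpha * lambda_p * t.

    beta     -- conditional probability of the next higher failure effect
    alpha    -- failure mode ratio (fraction of the part failure rate
                attributable to this particular failure mode)
    lambda_p -- part failure rate (failures per hour)
    t        -- duration of the applicable mission phase (hours)
    """
    return beta * alpha * lambda_p * t

# Made-up example: beta = 0.5, alpha = 0.3,
# lambda_p = 25 failures per million hours, 10-hour mission phase.
cm = failure_mode_criticality(beta=0.5, alpha=0.3, lambda_p=25e-6, t=10.0)
print(cm)  # ~3.75e-05
```

Summing Cm over all failure modes of an item that fall in the same severity class gives that item's criticality number for the class.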

ENGINEERING METROLOGY

MEASUREMENTS:
A measurement is the outcome of an opinion formed by one or more observers about the relative magnitude of some physical quantity.
CLASSIFICATION OF MEASUREMENTS:
    • Standards – reproduce the value of a given quantity
    • Fixed Gauges – check dimensions
    • Measuring Instruments – determine the measured value
NEED FOR MEASUREMENT:
1. To Determine the true dimensions of a part.
2. To increase our knowledge and understanding of the world.
3. Needed for ensuring public health and human safety.
4. To convert physical parameters into meaningful numbers.
5. To test if the elements that constitute the system function as per the design.
6. For evaluating the performance of a system.
7. For studying some basic laws of nature.
8. To ensure interchangeability with a view to promoting mass production.
9. To evaluate the response of the system to a particular input.
10. To check the limitations of theory in actual situations.
11. To establish the validity of design and for finding new data and new designs.
METHODS OF MEASUREMENT:
1. Direct Comparison
2. Indirect Comparison
3. Comparative Method
4. Coincidence Method
5. Fundamental Method
6. Contact Method
7. Transposition Method
8. Complementary Method
9. Deflection Method
Direct Method:
          Measurements are obtained directly from the instrument.
               Ex: Vernier caliper, scales.
Indirect Method:
         The value is obtained by measuring other, related quantities.
  Ex: Diameter measurement using three wires.
Comparative Method:
        The measured value is compared with another known value.
             Ex: Comparators.
Coincidence Method:
       Measurements are taken at the coincidence of certain lines or signals.
Fundamental Method:
       The quantity is measured directly in accordance with the definition of that quantity.
Contact Method:
      The sensor or measuring tip touches the surface being measured.
             Ex: Vernier caliper.
Transposition Method:
The quantity to be measured is first balanced by a known value and then balanced by another known value.
Ex: Determination of mass by balancing methods.
Complementary Method:
The value of the quantity to be measured is combined with a known value of the same quantity.
Ex: Volume determination by liquid displacement.
Deflection Method:
The value to be measured is indicated directly by the deflection of a pointer.
Ex: Pressure measurement.
TERMS OF MEASUREMENT:
Precision:
The ability of the instrument to reproduce its readings or observations again and again for a constant input signal.
Accuracy:
Closeness/conformity to the true value of the quantity under measurement.
Error:
The difference between the measured value and the true value is known as measurement error.
Error = Vm – Vt
Reliability:
It is defined as the probability that a given system will perform its function adequately for its specified lifetime under specified operating conditions.

FMEA | Failure Mode And Effect Analysis

Failure Mode – A particular way in which an item fails, independent of the reason for failure.
 Failure Mode and Effects Analysis (FMEA) – A procedure by which each credible failure mode of each item from a low indenture level to the highest is analyzed to determine the effects on the system and to classify each potential failure mode in accordance with the severity of its effect.
Indenture Levels – The hierarchy of hardware levels from the part to the component to the subsystem to the system, etc.
Redundancy – More than one independent means of performing a function.  There are different kinds of redundancy, including:
            (1) Operational – Redundant items, all of which are energized during the operating cycle; includes load-sharing, wherein redundant items are connected in a manner such that upon failure of one item, the other will continue to perform the function.  It is not necessary to switch out the failed item or switch in the redundant one.
            (2) Standby – Items that are inoperative (have no power applied) until they are switched in upon failure of the primary item.
(3) Like Redundancy – Identical items performing the same function.
            (4) Unlike Redundancy – Non-identical items performing the same function.
THE FMEA PROCESS
  • Define the system to be analyzed.  A complete system definition includes identification of internal and interface functions, expected performance at all indenture levels, system restraints, and failure definitions.  Also state systems and mission phases not analyzed giving rationale for the omissions.
  • Indicate the depth of the analysis by identifying the indenture level at which the analysis is begun.
  • Identify specific design requirements that are to be verified by the FMEA.
  • Define ground rules and assumptions on which the analysis is based.  Identify mission phases to be analyzed and the status of equipment during each mission phase.
  • Obtain or construct functional and reliability block diagrams indicating interrelationships of functional groups, system operation, independent data channels, and backup or workaround features of the system.
  • Identify failure modes, effects, failure detection and workaround features and other pertinent information on the worksheet.
  • Evaluate the severity of each failure effect in accordance with the prescribed severity categories.
FMEA Flow Diagram:
History:
The FMECA was originally developed by the National Aeronautics and Space Administration (NASA) to improve and verify the reliability of space program hardware.
FMECA Flow Diagram: ( Failure Mode, Effects and Criticality Analysis )
Criticality Analysis Flow:
Who is on the Team?

Areas to be represented are:
  • Quality
  • Logistics
  • Engineering
  • Purchasing
  • Manufacturing
  • Sales
  • Tooling
  • Marketing
  • Customer
  • Supplier

Errors in Measurement

Errors in Measurement :
Error = Measured Value – True Value
Em = Vm - Vt
1. Absolute Error :
            True absolute error :
= Result of measurement – True value
            Apparent absolute error :
= Result of measurement – Arithmetic mean of the series of measurements
2. Relative error :
It is defined as the ratio of the absolute error to the value of comparison used for calculation of that absolute error.
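As a minimal sketch of these two definitions (the reading and true value below are illustrative):

```python
def absolute_error(measured, true_value):
    """Absolute error: result of measurement minus the true value."""
    return measured - true_value

def relative_error(measured, true_value):
    """Ratio of the absolute error to the value of comparison
    (here the true value) used to calculate that absolute error."""
    return absolute_error(measured, true_value) / true_value

# Illustrative reading: a 25.0 mm gauge block measured as 25.2 mm.
print(absolute_error(25.2, 25.0))  # ~0.2 mm
print(relative_error(25.2, 25.0))  # ~0.008, i.e. 0.8 %
```

Relative error is often quoted as a percentage, which makes readings at different scales directly comparable.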
Causes of Errors :
1. Calibration Error:
             These are caused by deviation of the calibrated scale from its nominal value.
2. Environmental Error :
                         These are caused by environmental conditions such as humidity, temperature, and altitude.
3. Assembly Error:
i. Displaced scale (incorrect fitting)
ii. Non-uniform division of the scale
iii. Bent or distorted pointer
4. Random Error:
                            These occur naturally, with no specific assignable cause.
5. Systematic errors (or) Bias errors:
                                               These appear consistently in repeated readings, biasing every measurement in the same direction.
6. Chaotic errors :
                           These are caused by vibration, noise, and shock.

Terms in Engineering Measurements

Calibration:
If a known input is given to the measurement system and the output deviates from that input, corrections are made in the instrument and the output is measured again. This process is called “Calibration”.
Sensitivity:
Sensitivity is the ratio of change in the output signal to the change in the input signal.
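This ratio can be sketched as follows; the thermocouple-style figures are hypothetical, chosen only to show the unit bookkeeping:

```python
def sensitivity(delta_output, delta_input):
    """Ratio of the change in output signal to the change in input signal."""
    return delta_output / delta_input

# Hypothetical instrument: the output rises by 0.41 mV when the
# input temperature rises by 10 degrees C.
s = sensitivity(delta_output=0.41, delta_input=10.0)
print(s)  # ~0.041 mV per degree C
```

Note that sensitivity carries units (output units per input unit), so two instruments can only be compared on sensitivity when measured in the same units.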
Readability:
Refers to the ease with which the readings of a measuring instrument can be read.
True size:
Theoretical size of a dimension which is free from errors.
Actual size:
Size obtained through measurement with permissible error.
Hysteresis:
Not all of the energy put into a stressed component during loading is recovered upon unloading, so the output of a measurement partially depends on its previous inputs. This effect is called hysteresis.
Range:
The region between the two limits within which the physical variable is measured: the higher calibration value Hc and the lower calibration value Lc.
Ex: a tachometer with a scale from 0 to 11,000 rpm.
Span:
The algebraic difference between the higher and lower calibration values (Span = Hc – Lc).
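Span is a single subtraction; the tachometer figures below are a hypothetical example:

```python
def span(higher_cal, lower_cal):
    """Span = Hc - Lc, the algebraic difference between the
    higher and lower calibration values."""
    return higher_cal - lower_cal

# Example: a tachometer calibrated from Lc = 0 rpm to Hc = 11000 rpm.
print(span(11000, 0))  # 11000 rpm
```

For an instrument calibrated over -40 to +60 degrees C, the range is (-40, +60) but the span is 100, which is why the two terms are kept distinct.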
Resolution:
The minimum change in the input signal required to cause an appreciable change in the output is known as resolution.
Dead Zone:
It is the largest change in the physical variable to which the measuring instrument does not respond.
Threshold:
The minimum value of the input signal, starting from zero, required to produce a detectable change in the output.
Backlash:
The maximum distance through which one part of the instrument is moved without disturbing the other part.
Response Time:
The time at which the instrument begins to respond to a change in the measured quantity.
Repeatability:
The ability of the measuring instrument to repeat the same results during repeated measurements of the same quantity is known as repeatability.
Bias:
A characteristic of a measure or measuring instrument whereby the average of the indicated values differs from the true value of the measured quantity.
Magnification:
It means that the magnitude of the output signal of the measuring instrument is increased many times to make it more readable.
Drift:
If an instrument does not reproduce the same reading at different times of measurement for the same input signal, it is said to exhibit drift.
Reproducibility:
It is the closeness of agreement between the results of measurements of the same quantity when the individual measurements are carried out under changed conditions; it reflects the consistency of the pattern of variation in measurement.
Uncertainty:
The range about the measured value within which the true value of the measured quantity is likely to lie at the stated level of confidence.
Traceability:
It is the establishment of a calibration chain by step-by-step comparison with successively better standards.
Parallax:
An apparent change in the position of the index relative to the scale marks.



"2.5G"

2.5G, which stands for "second and a half generation," is a cellular wireless technology developed between its predecessor, 2G, and its successor, 3G.
"2.5G" is an informal term, invented solely for marketing purposes, unlike "2G" or "3G" which are officially defined standards based on those defined by the International Telecommunication (ITU). The term "2.5G" usually describes a 2G cellular system combined with General Packet Radio Services (GPRS), or other services not generally found in 2G or 1G networks.
Wireless telecommunication technologies like CDMA2000 1xRTT and Enhanced Data Rates for GSM Evolution (EDGE), also known as Enhanced GPRS (EGPRS), may qualify as 3G technologies since they have data transmission rates of 144 kbps or higher. However, they are usually classified as 2.5G because they have slower network speeds than most 3G services.
GPRS is a service commonly associated with 2.5G technology. It has data transmission rates of 28 kbps or higher. GPRS came after the development of the Global System for Mobile Communications (GSM), which is classified as 2G technology, and was succeeded by the Universal Mobile Telecommunications System (UMTS), which is classified as 3G technology.
A 2.5G system may make use of 2G system infrastructure, but it implements a packet-switched network domain in addition to the circuit-switched domain. This does not necessarily give 2.5G an advantage over 2G in terms of network speed, because bundling of timeslots is also used for circuit-switched data services such as High-Speed Circuit-Switched Data (HSCSD).

What is 2G Technology | Second Generation Wireless Technology | Digital Radio Signals

2G refers to second generation wireless telecommunication technology. While its predecessor, 1G, made use of analog radio signals, 2G uses digital radio signals.
Based on what type of multiplexing (the process of combining multiple digital data streams into one signal) is employed, 2G technologies may be categorized by whether they are based on time division multiple access (TDMA) or code division multiple access (CDMA).
TDMA-based 2G standards include the following: Global System for Mobile communications (GSM), used worldwide; Integrated Digital Enhanced Network (IDEN), developed by Motorola and used in the United States and Canada; Interim Standard 136 (IS-136) or Digital Advanced Mobile Phone System (D-AMPS), used in North and South America; and Personal Digital Cellular (PDC), used in Japan.
IS-95, on the other hand, is CDMA-based. It was developed by Qualcomm, and is alternately known as TIA-EIA-95 or cdmaOne.
2G makes use of a CODEC (compression-decompression algorithm) to compress and multiplex digital voice data. Through this technology, a 2G network can pack more calls into the same amount of bandwidth than a 1G network can. 2G cellphone units were also generally smaller than 1G units, since they emitted less radio power.
Another advantage of 2G over 1G is that the battery life of a 2G handset lasts longer, again due to the lower-powered radio signals. Since it transmitted data through digital signals, 2G also offered additional services such as SMS and e-mail. Its lower power emissions also made 2G handsets safer for consumers to use.
Error checking, a feature enabled by digital voice encoding, improved sound quality by lowering the noise floor. Digital voice encoding also made calls less susceptible to eavesdropping by third parties using radio scanners.
2G, however, does have its disadvantages as well. In comparison to 1G's analog signals, 2G's digital signals are very reliant on location and proximity. If a 2G handset made a call far away from a cell tower, the digital signal might be too weak to reach it.
While a call made from a 1G handset generally had poorer quality than one from a 2G handset, it survived over longer distances. This is because the analog signal degrades along a smooth curve, whereas the digital signal degrades along a jagged, angular one. As conditions worsened, the quality of a call made from a 1G handset would gradually worsen, but a call made from a 2G handset would fail completely.