
Specialty Transformers

BV EI 304 2818
HAHN
Printed-Circuit-Board Transformers
Quantity: 100
Ship Date: 10-15 working days
50+ $12.9024
100+ $12.7584
150+ $12.2976
250+ $9
500+ $5.8752
1000+ $3.0816
5000+ $2.6928
MOQ: 50
Mult: 50
SPQ: 50
BV EI 306 2065
HAHN
Printed-Circuit-Board Transformers
Quantity: 300
Ship Date: 10-15 working days
150+ $2.0016
350+ $1.9728
800+ $1.9152
MOQ: 250
Mult: 50
SPQ: 50
BV EI 304 2089
HAHN
Printed-Circuit-Board Transformers
Quantity: 200
Ship Date: 10-15 working days
100+ $9.4032
200+ $9.0864
250+ $7.2288
500+ $5.4864
1000+ $3.672
5000+ $3.3408
MOQ: 100
Mult: 50
SPQ: 50
BV EI 305 2100
HAHN
Printed-Circuit-Board Transformers
Quantity: 950
Ship Date: 10-15 working days
50+ $14.4144
150+ $13.9824
250+ $10.7424
500+ $7.632
1000+ $4.9824
5000+ $4.6368
MOQ: 50
Mult: 50
SPQ: 50
BV EI 306 2103
HAHN
Printed-Circuit-Board Transformers
Quantity: 12
Ship Date: 10-15 working days
50+ $13.2624
150+ $12.7872
250+ $9.6336
500+ $6.5808
1000+ $3.9744
5000+ $3.5856
MOQ: 50
Mult: 50
SPQ: 50
BV EI 301 3584
HAHN
Printed-Circuit-Board Transformers
Quantity: 20
Ship Date: 10-15 working days
100+ $7.776
250+ $5.8176
500+ $4.2912
1000+ $3.528
5000+ $2.8656
MOQ: 100
Mult: 50
SPQ: 50
BV EI 662 1089
HAHN
EI 66 PRINTED-CIRCUIT-BOARD TRANSFORMERS, OUTPUT CAPACITY UP TO 50 VA
Quantity: 4
Ship Date: 10-15 working days
18+ $36.5184
45+ $35.208
252+ $29.2032
504+ $23.544
1008+ $18.288
5004+ $17.7552
MOQ: 18
Mult: 9
SPQ: 9
BV EI 306 3603
HAHN
Power Transformer 3.6VA 10 Terminal Pin Thru-Hole
Quantity: 3
Ship Date: 10-15 working days
50+ $9.7056
100+ $9.6192
200+ $9.3312
250+ $7.344
500+ $5.544
1000+ $4.0608
5000+ $3.6432
MOQ: 50
Mult: 50
SPQ: 50
BV EI 306 3596
HAHN
Printed Circuit-Board Transformers
Quantity: 200
Ship Date: 10-15 working days
300+ $1.237
600+ $1.2067
1400+ $1.1347
MOQ: 400
Mult: 50
SPQ: 50
BV EI 302 2025
HAHN
EI 30 TRANSFORMERS CLASS F
Quantity: 0
Ship Date: 10-15 working days
100+ $3.3264
200+ $3.2832
500+ $3.1824
5000+ $2.7936
MOQ: 150
Mult: 50
SPQ: 50
BV EI 306 3367
HAHN
Printed Circuit-Board Transformers
Quantity: 0
Ship Date: 10-15 working days
100+ $4.0752
200+ $4.0176
400+ $3.9024
5000+ $3.4992
MOQ: 150
Mult: 50
SPQ: 50
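The listings above follow a tiered price-break scheme constrained by an MOQ (minimum order quantity) and an order multiple. A minimal sketch of how a unit price and extended price are derived from such a listing (the data below is taken from the BV EI 304 2818 entry; the `quote` helper is hypothetical, not a distributor API):

```python
# Tiered price breaks from the BV EI 304 2818 listing: (minimum qty, unit price USD).
PRICE_BREAKS = [
    (50, 12.9024), (100, 12.7584), (150, 12.2976),
    (250, 9.0000), (500, 5.8752), (1000, 3.0816), (5000, 2.6928),
]
MOQ, MULT = 50, 50  # minimum order quantity and order multiple

def quote(qty, breaks=PRICE_BREAKS, moq=MOQ, mult=MULT):
    """Return (unit_price, extended_price) for a valid order quantity."""
    if qty < moq or qty % mult != 0:
        raise ValueError(f"qty must be >= {moq} and a multiple of {mult}")
    # The highest break whose minimum quantity is met applies.
    unit = next(p for m, p in reversed(breaks) if qty >= m)
    return unit, round(qty * unit, 2)
```

With these breaks, `quote(50)` yields the $12.9024 unit price and a $645.12 extended price, matching the listing; ordering 250 pieces drops the unit price to the $9.0000 tier.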

Specialty Transformers

The term "Other Transformers" refers to a class of neural network architectures that extend the capabilities of the original Transformer model, introduced in the paper "Attention Is All You Need" by Vaswani et al. in 2017. The original Transformer revolutionized natural language processing (NLP) by using self-attention mechanisms to process sequences of data, such as text or time series.

Definition:
Other Transformers are variations or extensions of the basic Transformer architecture, designed to address specific challenges or to improve performance in various tasks. They often incorporate additional layers, attention mechanisms, or training techniques to enhance the model's capabilities.

Functions:
1. Enhanced Attention Mechanisms: Variants may modify the multi-head self-attention of the original model, which lets the model attend to different parts of the input sequence simultaneously, for example with sparse or local attention patterns that reduce the cost of long sequences.
2. Positional Encoding: To preserve the order of sequence data, positional encodings are added to the input embeddings.
3. Layer Normalization: This technique is used to stabilize the training of deep networks by normalizing the inputs to each layer.
4. Feedforward Networks: Each Transformer layer includes a feedforward neural network that processes the attention outputs.
5. Residual Connections: These connections help in training deeper networks by adding the output of a layer to its input before passing it to the next layer.
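Taken together, the five components above can be illustrated in a minimal NumPy sketch of a single encoder layer. This is illustrative only: learned projection weights are omitted or randomized for brevity, and all names and dimensions are chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def positional_encoding(seq_len, d_model):
    """Sinusoidal positional encodings, as in the original paper."""
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    return np.where(i % 2 == 0, np.sin(angles), np.cos(angles))

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def layer_norm(x, eps=1e-5):
    """Normalize each position's features to zero mean, unit variance."""
    return (x - x.mean(-1, keepdims=True)) / np.sqrt(x.var(-1, keepdims=True) + eps)

def multi_head_attention(x, num_heads):
    """Scaled dot-product self-attention, split across heads.
    (Identity Q/K/V projections are used here for brevity.)"""
    seq_len, d_model = x.shape
    d_head = d_model // num_heads
    out = np.zeros_like(x)
    for h in range(num_heads):
        sl = slice(h * d_head, (h + 1) * d_head)
        q = k = v = x[:, sl]
        scores = q @ k.T / np.sqrt(d_head)   # attention scores per position pair
        out[:, sl] = softmax(scores) @ v     # attention-weighted values
    return out

def encoder_layer(x, num_heads=4):
    # Residual connection + layer norm around the attention sublayer.
    x = layer_norm(x + multi_head_attention(x, num_heads))
    # Two-layer ReLU feedforward network, again with residual + norm.
    w1 = rng.standard_normal((x.shape[1], 2 * x.shape[1])) * 0.1
    w2 = rng.standard_normal((2 * x.shape[1], x.shape[1])) * 0.1
    ff = np.maximum(x @ w1, 0) @ w2
    return layer_norm(x + ff)

seq_len, d_model = 8, 16
tokens = rng.standard_normal((seq_len, d_model))
out = encoder_layer(tokens + positional_encoding(seq_len, d_model))
```

The positional encodings are added to the inputs before the first layer, and each layer preserves the (sequence length, model dimension) shape, which is what allows layers to be stacked.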

Applications:
- Natural Language Understanding (NLU): For tasks like sentiment analysis, question answering, and text classification.
- Machine Translation: To translate text from one language to another.
- Speech Recognition: Transcribing spoken language into written text.
- Time Series Analysis: For forecasting and pattern recognition in sequential data.
- Image Recognition: Some Transformers have been adapted for computer vision tasks.

Selection Criteria:
When choosing an Other Transformer model, consider the following:
1. Task Specificity: The model should be suitable for the specific task at hand, whether it's translation, summarization, or classification.
2. Data Size and Quality: Larger and more diverse datasets may require more complex models.
3. Computational Resources: More sophisticated models require more computational power and memory.
4. Training Time: Complex models may take longer to train.
5. Performance Metrics: Consider the model's performance on benchmarks relevant to your task.
6. Scalability: The model should be able to scale with the size of the data and the complexity of the task.

In summary, Other Transformers are a diverse family of models that build upon the foundational concepts of the original Transformer to address a wide range of challenges in machine learning and artificial intelligence. The choice of a specific model depends on the requirements of the task, the available data, and the computational resources.
Please refer to the product rule book for details.