
Specialty Transformers

Specialty Transformers results: showing 1-11 of 11.
Each listing below gives the part number, manufacturer, description, availability, tiered unit pricing, and ordering terms (MOQ, Mult, SPQ).
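How the tiered pricing and ordering terms combine is worth spelling out: the unit price is set by the highest quantity break the ordered quantity reaches, and the quantity must meet the MOQ and be a multiple of the order multiple (Mult). The sketch below illustrates this using the price breaks from the first listing (GT07-110-027); the function names and the rounding of the extended price are illustrative assumptions, not a published ordering API.

```python
# Illustrative sketch: how a quantity maps to a unit price under the tiered
# price breaks and ordering terms (MOQ, order multiple) shown in the listings.
# Price breaks copied from the GT07-110-027 listing; helper names are made up.

PRICE_BREAKS = [          # (minimum quantity, unit price in USD), ascending
    (1, 5.014), (10, 3.916), (25, 3.201),
    (50, 3.0456), (100, 3.024), (500, 2.5596),
]
MOQ = 1                   # minimum order quantity
ORDER_MULTIPLE = 1        # ordered quantity must be a multiple of this

def unit_price(qty: int) -> float:
    """Unit price at the highest break the ordered quantity reaches."""
    if qty < MOQ or qty % ORDER_MULTIPLE != 0:
        raise ValueError("quantity violates MOQ or order multiple")
    applicable = [price for brk, price in PRICE_BREAKS if qty >= brk]
    return applicable[-1]  # breaks are ascending, so the last match is the best tier

def extended_price(qty: int) -> float:
    """Extended (line-item) price: unit price times quantity."""
    return round(unit_price(qty) * qty, 2)

print(unit_price(100), extended_price(100))   # 3.024 302.4
```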
Part Number: GT07-110-027
Manufacturer: ICE Components
Description: 340 μH, 1:1:1, SMD mount, 10 mm (length) × 6.35 mm (height)
Availability: 643 in stock; ships in 6-13 working days
Unit Price: 1+ $5.014; 10+ $3.916; 25+ $3.201; 50+ $3.0456; 100+ $3.024; 500+ $2.5596
MOQ: 1; Mult: 1; SPQ: 1

Part Number: CT02-200
Manufacturer: ICE Components
Description: 7.1 mH, 1:200, SMD mount, 6.4 mm (length) × 4.6 mm (height)
Availability: 882 in stock; ships in 6-13 working days
Unit Price: 1+ $3.6455; 10+ $2.849; 25+ $2.42; 50+ $2.3004; 100+ $2.2356; 250+ $2.1816; 1000+ $1.7955
MOQ: 1; Mult: 1; SPQ: 1

Part Number: GT17005
Manufacturer: ICE Components
Description: 50 μH, 1:3.5, SMD mount, 10 mm (length) × 6.35 mm (height)
Availability: 35 in stock; ships in 6-13 working days
Unit Price: 1+ $3.3695; 10+ $2.76; 25+ $2.244; 50+ $2.1384; 100+ $2.106; 500+ $1.782; 5000+ $1.6695
MOQ: 1; Mult: 1; SPQ: 1

Part Number: CT05-100
Manufacturer: ICE Components
Description: 18 mH, 500 kHz, 1:100, through-hole mount, 17.6 mm (length) × 3 mm (height)
Availability: 1486 in stock; ships in 6-13 working days
Unit Price: 1+ $4.347; 10+ $3.399; 25+ $2.882; 50+ $2.7432; 100+ $2.6568; 250+ $2.6028; 1000+ $2.142
MOQ: 1; Mult: 1; SPQ: 1

Part Number: CT09-020
Manufacturer: ICE Components
Description: 160 μH, 1:20, SMD mount, 10 mm (length) × 6.35 mm (height)
Availability: 251 in stock; ships in 6-13 working days
Unit Price: 1+ $3.1395; 10+ $2.5415; 25+ $2.079; 50+ $1.9764; 100+ $1.9224; 250+ $1.8468; 500+ $1.7604; 1000+ $1.6695; 5000+ $1.554
MOQ: 1; Mult: 1; SPQ: 1

Part Number: GT06-122-053
Manufacturer: ICE Components
Description: 162 μH, 1:2.5:2.5, pulse transformer, SMD mount, 16 mm (length) × 7.5 mm (height)
Availability: 1045 in stock; ships in 6-13 working days
Unit Price: 1+ $3.105; 10+ $2.5185; 25+ $2.057; 50+ $1.9548; 100+ $1.9008; 250+ $1.8252; 500+ $1.7928; 1200+ $1.6275; 4800+ $1.533
MOQ: 1; Mult: 1; SPQ: 1

Part Number: CT10-100
Manufacturer: ICE Components
Description: Current sense transformer
Availability: 1022 in stock; ships in 6-13 working days
Unit Price: 1+ $3.772; 10+ $2.959; 25+ $2.508; 50+ $2.3868; 100+ $2.3112; 250+ $2.2248; 500+ $2.1492; 1000+ $2.0265
MOQ: 1; Mult: 1; SPQ: 1

Part Number: CT05-1000
Manufacturer: ICE Components
Description: 1.9 H, 500 kHz, 1:1000, through-hole mount, 17.6 mm (length) × 15 mm (height)
Availability: 889 in stock; ships in 6-13 working days
Unit Price: 1+ $6.164; 10+ $4.708; 25+ $4.3956; 50+ $4.2984; 100+ $4.158; 242+ $3.9852; 484+ $3.7485; 1089+ $3.633
MOQ: 1; Mult: 1; SPQ: 1

Part Number: CT02-050
Manufacturer: ICE Components
Description: 440 μH, 1:50, SMD mount, 6.4 mm (length) × 4.6 mm (height)
Availability: 953 in stock; ships in 6-13 working days
Unit Price: 1+ $3.404; 10+ $2.7945; 25+ $2.266; 50+ $2.1492; 100+ $2.0844; 250+ $2.0412; 1000+ $1.68
MOQ: 1; Mult: 1; SPQ: 1

Part Number: CT09-060
Manufacturer: ICE Components
Description: 1.4 mH, 1:60, SMD mount, 10 mm (length) × 6.35 mm (height)
Availability: 416 in stock; ships in 6-13 working days
Unit Price: 1+ $2.898; 10+ $2.346; 25+ $1.925; 50+ $1.859; 100+ $1.8036; 500+ $1.5228; 5000+ $1.428
MOQ: 1; Mult: 1; SPQ: 1

Part Number: GT06-111-049
Manufacturer: ICE Components
Description: 882 μH, 1:1:1, pulse transformer, SMD mount, 16 mm (length) × 7.5 mm (height)
Availability: 30 in stock; ships in 7-12 working days
Unit Price: 1+ $5.3676; 10+ $4.6738; 25+ $4.4857; 50+ $4.3476; 100+ $4.2129
MOQ: 1; Mult: 1; SPQ: 1

Specialty Transformers

"Other Transformers" here refers to a class of neural network architectures that extend the capabilities of the original Transformer model, introduced in the paper "Attention Is All You Need" by Vaswani et al. in 2017. The original Transformer revolutionized natural language processing (NLP) with its use of self-attention mechanisms to process sequences of data such as text or time series.

Definition:
Other Transformers are variations or extensions of the basic Transformer architecture, designed to address specific challenges or to improve performance in various tasks. They often incorporate additional layers, attention mechanisms, or training techniques to enhance the model's capabilities.

Functions:
1. Enhanced Attention Mechanisms: Multi-head attention, used even in the original model, lets the network attend to different parts of the input sequence simultaneously; many variants modify or approximate the attention computation, for example to handle longer sequences more efficiently.
2. Positional Encoding: To preserve the order of sequence data, positional encodings are added to the input embeddings.
3. Layer Normalization: This technique is used to stabilize the training of deep networks by normalizing the inputs to each layer.
4. Feedforward Networks: Each Transformer layer includes a feedforward neural network that processes the attention outputs.
5. Residual Connections: These connections help in training deeper networks by adding the output of a layer to its input before passing it to the next layer. A minimal sketch combining all five of these components follows this list.
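To make these five components concrete, here is a minimal, self-contained sketch of one Transformer encoder block. PyTorch, the layer sizes, and the class and function names are illustrative assumptions, not any particular published implementation.

```python
import math
import torch
import torch.nn as nn

class MiniEncoderBlock(nn.Module):
    """One simplified Transformer encoder layer: multi-head self-attention and a
    feedforward network, each wrapped with a residual connection and layer norm."""
    def __init__(self, d_model: int = 64, n_heads: int = 4, d_ff: int = 256):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.ff = nn.Sequential(nn.Linear(d_model, d_ff), nn.ReLU(),
                                nn.Linear(d_ff, d_model))
        self.norm1 = nn.LayerNorm(d_model)
        self.norm2 = nn.LayerNorm(d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        attn_out, _ = self.attn(x, x, x)     # multi-head self-attention
        x = self.norm1(x + attn_out)         # residual connection + layer norm
        x = self.norm2(x + self.ff(x))       # feedforward + residual + layer norm
        return x

def sinusoidal_positions(seq_len: int, d_model: int) -> torch.Tensor:
    """Fixed positional encodings so the model can use token order."""
    pos = torch.arange(seq_len).unsqueeze(1).float()
    i = torch.arange(0, d_model, 2).float()
    angles = pos / torch.pow(10000.0, i / d_model)
    pe = torch.zeros(seq_len, d_model)
    pe[:, 0::2] = torch.sin(angles)
    pe[:, 1::2] = torch.cos(angles)
    return pe

# Toy usage: a batch of 2 sequences, 10 tokens each, embedding size 64.
tokens = torch.randn(2, 10, 64)
x = tokens + sinusoidal_positions(10, 64)    # add positional information
out = MiniEncoderBlock()(x)
print(out.shape)                             # torch.Size([2, 10, 64])
```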

Applications:
- Natural Language Understanding (NLU): For tasks like sentiment analysis, question answering, and text classification (a short usage example follows this list).
- Machine Translation: To translate text from one language to another.
- Speech Recognition: Transcribing spoken language into written text.
- Time Series Analysis: For forecasting and pattern recognition in sequential data.
- Image Recognition: Some Transformers have been adapted for computer vision tasks.
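As a concrete example of the NLU application mentioned above, a pretrained Transformer can be applied to sentiment analysis in a few lines with the Hugging Face transformers library; the library choice, its default downloaded model, and the sample sentence are assumptions for illustration.

```python
# Sentiment analysis with a pretrained Transformer.
# Assumes the Hugging Face "transformers" package (with a backend such as PyTorch) is installed.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")   # downloads a default pretrained model
result = classifier("The new pulse transformer arrived earlier than expected.")
print(result)                                 # e.g. [{'label': 'POSITIVE', 'score': 0.99}]
```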

Selection Criteria:
When choosing an Other Transformer model, consider the following:
1. Task Specificity: The model should be suitable for the specific task at hand, whether it's translation, summarization, or classification.
2. Data Size and Quality: Larger and more diverse datasets may require more complex models.
3. Computational Resources: More sophisticated models require more computational power and memory; a quick parameter-count check is sketched after this list.
4. Training Time: Complex models may take longer to train.
5. Performance Metrics: Consider the model's performance on benchmarks relevant to your task.
6. Scalability: The model should be able to scale with the size of the data and the complexity of the task.
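As a rough way to weigh the computational-resources criterion above, one can count a candidate model's trainable parameters before committing to training it. The sketch below does this for a single stock PyTorch encoder layer, with the layer sizes chosen only for illustration.

```python
import torch.nn as nn

# Count trainable parameters of one standard Transformer encoder layer.
# nn.TransformerEncoderLayer is part of PyTorch; the sizes below are illustrative.
layer = nn.TransformerEncoderLayer(d_model=512, nhead=8, dim_feedforward=2048)
n_params = sum(p.numel() for p in layer.parameters() if p.requires_grad)
print(f"{n_params:,} trainable parameters")   # roughly 3.15 million at these sizes
```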

In summary, Other Transformers are a diverse family of models that build upon the foundational concepts of the original Transformer to address a wide range of challenges in machine learning and artificial intelligence. The choice of a specific model depends on the requirements of the task, the available data, and the computational resources.
Please refer to the product rule book for details.