U.S. Coin Forum

Anyone know what Bert is?

Coin Finder Posts: 7,517 ✭✭✭✭✭

Comments

  • coinandcurrency242 Posts: 1,988 ✭✭✭✭
    edited November 14, 2025 10:22AM

    They are a Whatnot seller (Trader Bea), and that is the special label they get.

    Positive BST as a seller: Namvet69, Lordmarcovan, Bigjpst, Soldi, mustanggt, CoinHoader, moursund, SufinxHi, al410, JWP

  • yosclimber Posts: 5,151 ✭✭✭✭✭
    edited November 14, 2025 10:29AM
  • P0CKETCHANGE Posts: 3,081 ✭✭✭✭✭

    Seems like their marketing is working given how many threads there have been about it here.

    Nothing is as expensive as free money.

  • BLUEJAYWAY Posts: 10,638 ✭✭✭✭✭

    Sorry. Only know BART: Bay Area Rapid Transit.

    Successful transactions: Tookybandit. "Everyone is equal, some are more equal than others".
  • pcgsregistrycollector Posts: 2,152 ✭✭✭✭✭

    BERT is a language model developed by Google that uses a transformer architecture to understand the context of words in a sentence from both directions, left-to-right and right-to-left. It is pre-trained on a massive amount of text data and can then be fine-tuned for various specific tasks with minimal effort. This makes it highly effective for applications like improved search queries, sentiment analysis, and question answering.

    Key characteristics
    - Bidirectional training: unlike older models that process text in a single direction, BERT analyzes each word in the context of the entire sentence, from both the left and the right.
    - Pre-training and fine-tuning: BERT is first pre-trained on general language patterns; developers can then fine-tune it with a small amount of task-specific data so it performs well on new tasks such as sentiment analysis or question answering.
    - Transformer architecture: it is built on the transformer model, which uses a mechanism called "self-attention" to weigh the importance of different words in a sentence.
    - Auto-encoding: it is an auto-encoding language model, producing a vector representation for each word based on its surrounding context.

    Common applications
    - Search engines: it helps search engines understand the context of user queries, leading to more accurate results.
    - Question answering: it can find the specific span of text that answers a question within a larger document.
    - Sentiment analysis: it can determine the emotional tone (positive, negative, or neutral) of text, such as customer reviews.
    - Text classification: it can categorize text into different groups.
    - Named Entity Recognition (NER): it identifies and categorizes entities such as names of people, organizations, and locations within a text.
    - Chatbots and virtual assistants: it enables more natural, context-aware conversations.
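    The "self-attention" mechanism mentioned above can be sketched in a few lines. This is a toy illustration with made-up numbers, not BERT itself: real transformers use learned Q/K/V projection matrices and many heads, which are omitted here for simplicity. Each token's output becomes a weighted mix of every token in the sentence, which is what lets context flow in both directions.

    ```python
    import numpy as np

    def self_attention(X):
        """Minimal single-head self-attention over token embeddings X
        (shape: tokens x dims). Returns the mixed outputs and the
        attention weights."""
        d = X.shape[1]
        # A real transformer derives Q, K, V from learned projections;
        # here we reuse the raw embeddings as all three for illustration.
        Q, K, V = X, X, X
        scores = Q @ K.T / np.sqrt(d)                   # pairwise relevance
        weights = np.exp(scores)
        weights /= weights.sum(axis=1, keepdims=True)   # softmax per token
        return weights @ V, weights

    # Toy "sentence": 3 tokens with 4-dimensional embeddings
    X = np.array([[1.0, 0.0, 1.0, 0.0],
                  [0.0, 1.0, 0.0, 1.0],
                  [1.0, 1.0, 0.0, 0.0]])
    out, w = self_attention(X)
    # Each row of w sums to 1: every token attends over the whole sentence,
    # both to its left and to its right.
    ```

    Because every token attends to every other token simultaneously, there is no left-to-right bottleneck, which is the "bidirectional" property the post describes.
    
    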

    Proud follower of Christ!

  • Morgan White Posts: 11,855 ✭✭✭✭✭

  • emeraldATV Posts: 5,122 ✭✭✭✭✭

    Oh, my! "The BERT label features a variety of designs & themes to enhance the presentation of different coins and sets."

  • seatedlib3991 Posts: 1,423 ✭✭✭✭✭

    A confusing thread for a coin site. I assumed Bert meant the inner letters of LI BERT Y. James

  • ScarsdaleCoin Posts: 5,386 ✭✭✭✭✭

    Bert was a great hobo nickel carver

    Jon Lerner - Scarsdale Coin - www.CoinHelp.com
