NVIDIA Unveils Nemotron-CC: A Trillion-Token Dataset for Enhanced LLM Training

By: cryptosheadlines | 2025/05/08
Joerg Hiller | May 07, 2025

NVIDIA introduces Nemotron-CC, a trillion-token dataset for large language models, integrated with NeMo Curator. This pipeline optimizes data quality and quantity for superior AI model training.

NVIDIA has integrated its Nemotron-CC pipeline into NeMo Curator, offering a groundbreaking approach to curating high-quality datasets for large language models (LLMs). The Nemotron-CC dataset is a 6.3-trillion-token English-language collection derived from Common Crawl, aimed at significantly improving LLM accuracy, according to NVIDIA.

Advancements in Data Curation

The Nemotron-CC pipeline addresses the limitations of traditional data curation methods, which often discard potentially useful data through heuristic filtering. By employing classifier ensembling and synthetic data rephrasing, the pipeline generates 2 trillion tokens of high-quality synthetic data, recovering up to 90% of the content lost to filtering.

Innovative Pipeline Features

The curation process begins with HTML-to-text extraction using tools such as jusText, with FastText handling language identification. Deduplication then removes redundant data, using NVIDIA RAPIDS libraries for efficient processing. The process applies 28 heuristic filters to ensure data quality, plus a PerplexityFilter module for further refinement.

Quality labeling is performed by an ensemble of classifiers that assess documents and assign them to quality levels, enabling targeted synthetic data generation. This approach supports the creation of diverse QA pairs, distilled content, and organized knowledge lists from the text.

Impact on LLM Training

Training LLMs with the Nemotron-CC dataset yields significant improvements.
For instance, a Llama 3.1 model trained on a 1-trillion-token subset of Nemotron-CC achieved a 5.6-point increase in MMLU score compared to models trained on traditional datasets. Furthermore, models trained on long-horizon tokens, including Nemotron-CC, saw a 5-point boost in benchmark scores.

Getting Started with Nemotron-CC

The Nemotron-CC pipeline is available to developers aiming to pretrain foundation models or perform domain-adaptive pretraining across various fields. NVIDIA provides a step-by-step tutorial and APIs for customization, allowing users to tailor the pipeline to specific needs. The integration into NeMo Curator enables seamless development of both pretraining and fine-tuning datasets.

For more information, visit the NVIDIA blog.

Image source: Shutterstock
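To make the pipeline stages above concrete, here is a minimal, self-contained sketch of two of them — exact deduplication and heuristic filtering — in plain Python. This is an illustration only, not NeMo Curator's actual API: the function names, thresholds, and toy heuristics (minimum word count, symbol-to-text ratio) are assumptions standing in for the pipeline's 28 production filters and RAPIDS-accelerated deduplication.

```python
import hashlib

def exact_dedup(docs):
    """Drop documents whose whitespace-normalized, lowercased text
    hashes to an already-seen digest."""
    seen, unique = set(), []
    for doc in docs:
        digest = hashlib.md5(" ".join(doc.split()).lower().encode("utf-8")).hexdigest()
        if digest not in seen:
            seen.add(digest)
            unique.append(doc)
    return unique

def passes_heuristics(doc, min_words=5, max_symbol_ratio=0.3):
    """Two toy heuristic filters: minimum length and symbol-to-text ratio."""
    words = doc.split()
    if len(words) < min_words:
        return False
    symbols = sum(1 for ch in doc if not ch.isalnum() and not ch.isspace())
    return symbols / max(len(doc), 1) <= max_symbol_ratio

docs = [
    "Large language models benefit from carefully curated training data.",
    "Large  language models benefit from carefully curated training data.",  # near-verbatim duplicate
    "$$$ ### @@@ !!!",  # symbol-heavy junk
    "too short",
]
curated = [d for d in exact_dedup(docs) if passes_heuristics(d)]
print(len(curated))  # → 1: the duplicate, the junk, and the short line are all dropped
```

In the real pipeline these stages run at Common Crawl scale, which is why NVIDIA uses RAPIDS-accelerated deduplication rather than a single-machine hash set like this one.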

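The quality-labeling step can be sketched the same way: an ensemble of classifiers scores each document, and the averaged score is mapped to a coarse quality bucket that then drives targeted synthetic data generation. The two scoring functions below are hypothetical stand-ins for trained model classifiers, and the bucket thresholds are illustrative assumptions, not the pipeline's actual values.

```python
from statistics import mean

# Hypothetical stand-ins for trained quality classifiers; each returns a score in [0, 1].
def edu_value_score(doc: str) -> float:
    """Toy proxy for educational value."""
    return min(doc.lower().count("learn") * 0.5 + 0.2, 1.0)

def fluency_score(doc: str) -> float:
    """Toy proxy for fluency: longer documents score higher, capped at 1.0."""
    return min(len(doc.split()) / 20, 1.0)

CLASSIFIERS = [edu_value_score, fluency_score]

def quality_bucket(doc: str) -> str:
    """Average the ensemble's scores and map the result to a coarse quality level."""
    score = mean(clf(doc) for clf in CLASSIFIERS)
    if score >= 0.7:
        return "high"
    if score >= 0.4:
        return "medium"
    return "low"
```

In Nemotron-CC, buckets like these determine how documents are treated downstream: higher-quality text feeds targeted synthetic generation such as QA pairs, distilled content, and knowledge lists, while rephrasing recovers usable content from lower buckets.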