Presearch, a decentralized and privacy-oriented search engine, has just launched PreGPT 2.0, marking the company's latest attempt to challenge Big Tech's dominance in the AI chatbot space.
The new release adds improved language models and a wider variety of open-source AI options, all running on a network of distributed servers rather than centralized data centers.
"Why am I so excited? Because PreGPT 2.0 is so powerful and uncensored that it has the potential to finally break the echo chamber effect that has long shaped conventional wisdom, amplifying the herd instinct into blind conformity," said Brenden Tacon, innovation and operations lead for Presearch.
The upgraded chatbot comes with a $2 monthly basic plan that runs Mistral AI's 7B model and a $5 pro plan powered by Venice.ai's more powerful LLMs. Both options promise to keep user data private and conversations anonymous, with chats permanently erased after deletion.
PreGPT 2.0's model lineup features six of the most well-known names in the open-source AI space: Meta's Llama-3.1-405b (a gigantic model), Llama-3.2-3b (a very small model built for efficiency), and Llama-3.3-70b (its latest LLM), as well as Alibaba's Qwen 32b.
It also leverages the older Dolphin 2.9 model, known in AI circles for being completely uncensored and powerful, and great at roleplay. The company also appears to have fine-tuned the Mistral 7B model to offer a customized version.
"This model gracefully handles a context of 8,000 tokens, which equates to about 5,000 words, and you will be throttled to 1,000 messages per month," according to the company's website.
In practice, that means the model can hold about 5,000 words in memory and cannot properly process prompts longer than that.
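The company's own numbers imply a ratio of 5,000 words per 8,000 tokens, which can be turned into a rough length check. This is a back-of-the-envelope sketch, not Presearch's actual tokenizer; real tokenizers vary:

```python
# Rough length check for an 8,000-token context window.
# The words-per-token ratio is derived from the company's own
# "8,000 tokens is about 5,000 words" claim; real tokenizers differ.
CONTEXT_TOKENS = 8_000
WORDS_PER_TOKEN = 5_000 / 8_000  # 0.625 words per token

def estimated_tokens(text: str) -> int:
    """Estimate token count from a simple whitespace word count."""
    return round(len(text.split()) / WORDS_PER_TOKEN)

def fits_in_context(text: str) -> bool:
    """True if the prompt should fit within the 8,000-token window."""
    return estimated_tokens(text) <= CONTEXT_TOKENS

prompt = "word " * 6_000          # a 6,000-word prompt
print(estimated_tokens(prompt))   # 9600 tokens, over the limit
print(fits_in_context(prompt))    # False
```

Anything past the window simply falls out of the model's memory, which is why long documents have to be summarized or split before pasting.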
What is Presearch?
Presearch, which went live in 2018 after an initial beta, is essentially a project that aims to reimagine search engine architecture using decentralized technology.
The app processes more than 12 million monthly searches through a web of independent nodes. Each node operator stakes PRE tokens and provides computing power to the network, enabling a self-sustaining ecosystem that grows organically with demand.
The idea is that a decentralized network undercuts Google's business model by making it harder to profile users, helping create a more open and healthy alternative.
The platform's advertising model is also different from what you see on Google or Bing, for example.
Instead of bidding wars over keywords, advertisers stake PRE tokens to gain visibility. The more tokens they stake, the better their placement, a system that reduces token circulation while producing predictable income.
A portion of these tokens is burned periodically, gradually decreasing the total supply from its current 590 million PRE in circulation.
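As a toy illustration of how periodic burns shrink supply, here is a minimal sketch; the burn rate and staked amount are made-up figures, and only the 590 million starting supply comes from the article:

```python
# Toy model of periodic token burns reducing circulating supply.
# The burn rate and staked amount below are illustrative
# assumptions; only the 590M starting supply is from the article.
SUPPLY = 590_000_000

def burn(supply: int, staked: int, rate: float = 0.005) -> int:
    """Remove a fraction of staked tokens from circulation."""
    return supply - int(staked * rate)

new_supply = burn(SUPPLY, staked=10_000_000)
print(new_supply)  # 589950000, after burning 50,000 PRE
```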
PreGPT 2.0 makes use of this distributed network by working with Venice.ai, a privacy-conscious AI services provider, and Salad.com, a platform that pools decentralized GPU power.
The pro tier runs on Venice.ai's high-performance network, while the basic plan is supported by Salad.com's distributed GPU system.
Both pathways encrypt user interactions and refrain from storing chat logs, upholding Presearch's commitment to privacy.
PRE's tokenomics keeps the ecosystem running smoothly. Node operators are compensated based on their stake size and search volume, while users earn up to 8 tokens per day for search queries.
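The user-side reward cap can be sketched minimally; the per-search payout below is a hypothetical figure, since the article only states the 8-token daily cap:

```python
# Hypothetical model of the daily PRE reward cap for users.
# REWARD_PER_SEARCH is an assumed figure for illustration;
# only the 8-token daily cap comes from the article.
DAILY_CAP = 8.0
REWARD_PER_SEARCH = 0.25  # assumed, not a published number

def daily_user_rewards(searches: int) -> float:
    """PRE a user earns in one day, capped at DAILY_CAP."""
    return min(searches * REWARD_PER_SEARCH, DAILY_CAP)

print(daily_user_rewards(10))   # 2.5 PRE
print(daily_user_rewards(100))  # 8.0 PRE, cap reached
```

The cap keeps search rewards from being farmed indefinitely while still paying out for ordinary daily use.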
This seems to be a win-win situation, at least in theory, where both users and advertisers are adequately compensated while supporting the ecosystem's growth.
While PreGPT 2.0 is a distinct AI offering added to Presearch's toolkit, the company remains focused on its primary objective of decentralized, private search.
The chatbot integration is meant to enhance the search experience without overshadowing it.
The goal is to make the entire system user-friendly for people interested in using AI tools in their daily lives.
Hands-On with PreGPT 2.0: Promise and Limitations
Tests of PreGPT 2.0 revealed a capable chatbot that prioritizes function over flash. The interface felt less cluttered than Venice.ai's and, despite lacking the image-generation capabilities that have become common elsewhere, was similar to HuggingChat.
A system prompt feature lets users fine-tune the AI's behavior through custom instructions, which helps elicit more accurate responses; a good system prompt can significantly improve a model's performance.
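System prompts follow a widely used chat-message convention; the sketch below uses the generic OpenAI-style message format as an assumption, since Presearch's actual API shape isn't documented here:

```python
# Generic chat-message layout showing where a system prompt sits.
# This OpenAI-style structure is an illustrative convention, not
# Presearch's documented API.
messages = [
    {
        "role": "system",  # custom instructions steering every reply
        "content": "You are a concise assistant. Answer in plain "
                   "English and include units with every number.",
    },
    {"role": "user", "content": "How far is the Moon from Earth?"},
]

# The system message applies to the whole conversation, which is
# why a well-written one can noticeably improve response quality.
print(messages[0]["role"])  # system
```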
People who have previously experimented with various chatbots will find the overall experience familiar.
This wasn't a revolutionary leap in AI capability; it was a privacy-focused application of existing open-source models, which are frequently less capable than popular alternatives like GPT-4o or DeepSeek.
The platform only handles plain text. It can craft a bedtime story or summarize trends, but it lacks support for Excel documents and cannot properly parse CSV files, PDFs, or third-party docs.
Instead, users must manually copy a file's contents and paste them into the chat, which is far from ideal. Those who associate decentralization with slow speeds, however, have nothing to worry about.
Replies were fast, and the chatbot never hung. However, the models delivered the results you would expect from open-source LLMs that aren't exactly dominating the LLM Arena: Llama 3.1 405b, the most powerful model on Presearch's roster, currently sits in 27th position.
It’s not bad, but it’s also not impressive by today’s standards.
There are open-source models of comparable size that are significantly more powerful.
For example, Llama-3.1-Nemotron-70B-Instruct could easily substitute for the newer (but not better) Llama-3.3-70b, and DeepSeek R1, the best open-source model to date, is leaps ahead of Meta's Llama 3.1 405b.
Overall, the experience was pleasant, the models performed as expected, and the interface was easier to use than Venice AI, its main competitor.
This offering is undoubtedly worthwhile if you are looking for a privacy solution or want to try every AI tool on the market right now. Just keep in mind that the search engine won't replace Google, and the AI chatbot won't replace ChatGPT, at least not yet.
Edited by Sebastian Sinclair and Josh Quittner