
BERT Reviews & Product Details

Verified User in Computer Software
Enterprise (> 1000 emp.)
Validated Reviewer
Review source: Organic
What do you like best about BERT?

Very good NLP model that is open-source. Review collected by and hosted on G2.com.

What do you dislike about BERT?

The large number of parameters makes the model difficult to train without several GPUs.

What problems is BERT solving and how is that benefiting you?

It solves the problem of needing an open-source LLM to train for a specific task. It allows for using the out-of-the-box model or for fine-tuning, which lets users create a new model specifically tuned to their data.

BERT Overview

What is BERT?

BERT, short for Bidirectional Encoder Representations from Transformers, is a machine learning (ML) framework for natural language processing. In 2018, Google developed this algorithm to improve contextual understanding of unlabeled text across a broad range of tasks by learning to predict text that might come before and after (bi-directional) other text.
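The "predict text before and after" idea can be illustrated with a deliberately tiny, stdlib-only sketch. This is a word-count model, not BERT itself, and the corpus and words below are made up: the point is only that conditioning a masked word on both its left and right neighbours disambiguates cases a left-only model cannot.

```python
from collections import Counter

def predict_masked(corpus, left, right):
    """Guess a masked word from BOTH neighbours: a toy,
    count-based stand-in for BERT's bidirectional masked-LM objective."""
    counts = Counter()
    for sentence in corpus:
        toks = sentence.split()
        for i in range(1, len(toks) - 1):
            if toks[i - 1] == left and toks[i + 1] == right:
                counts[toks[i]] += 1
    return counts.most_common(1)[0][0] if counts else None

corpus = [
    "the bank raised interest rates",
    "the bank approved the loan",
    "the river bank was muddy",
]
# Left context alone ("the ___") is ambiguous; adding the right
# context ("___ approved") pins the masked word down.
print(predict_masked(corpus, "the", "approved"))  # -> bank
```

A real BERT does this with attention over every token in the sentence rather than adjacent-word counts, but the benefit of seeing both sides of the mask is the same.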

Seller Details

Seller: Google
Year Founded: 1998
HQ Location: Mountain View, CA
Twitter: @google (32,553,933 followers)
LinkedIn® Page: www.linkedin.com (301,875 employees on LinkedIn®)
Ownership: NASDAQ:GOOG
Phone: +1 (650) 253-0000
Total Revenue (USD mm): $182,527
Description: Organize the world’s information and make it universally accessible and useful.

Recent BERT Reviews

Aniket s., Small-Business (50 or fewer emp.)
5.0 out of 5
"very helpful"
medical code representation; ambiguous language in text by using surrounding text to establish cont...

Verified User, Mid-Market (51-1000 emp.)
5.0 out of 5
"I've used it in my several project so certainly recommend"
Easy to use for any one and very efficient

Rakesh K., Mid-Market (51-1000 emp.)
4.5 out of 5
"A Satisfied BERT User"
An open source product by Google. Very easy to implement and work with. It is very flexible to customise for any specific tasks that is very helpfu...
53 out of 54 Total Reviews for BERT

4.4 out of 5

BERT Pros and Cons

How are these determined?
Pros and Cons are compiled from review feedback and grouped into themes to provide an easy-to-understand summary of user reviews.
G2 reviews are authentic and verified.
APOORV G.
Software Engineer
Small-Business (50 or fewer emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

Its ability to capture contextual nuances in language is outstanding, allowing for more accurate and context-aware natural language understanding. Its bidirectional approach and pre-training on extensive datasets contribute to its versatility across a spectrum of NLP tasks, making it a powerful tool in the field.

What do you dislike about BERT?

Its computational intensity, requiring substantial resources for training and inference. Also, it struggles with out-of-vocabulary words and might not handle long-range dependencies as effectively. Despite these limitations, ongoing research and advancements aim to address and mitigate these challenges in future models.

What problems is BERT solving and how is that benefiting you?

It addresses challenges in natural language understanding by improving context comprehension and capturing intricate language nuances. This benefits me by enhancing the quality of responses I provide. BERT's bidirectional approach enables a better understanding of context, aiding in more accurate and context-aware language generation. This improvement results in more relevant and coherent answers, contributing to a more effective and satisfying user experience during our interactions.

Bittu M.
Technical Assistant
Small-Business (50 or fewer emp.)
Validated Reviewer
Verified Current User
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

I have been using BERT for the last 3 months now. It gives precise, to-the-point answers for my daily activities, and as a chatbot it gives completely relevant information, like a mentor available 24/7. I highly recommend it to everyone. I'm saving lots of time and effort using BERT.

What do you dislike about BERT?

About the interface: as a Google product, it should look more classy. The information could also be made more human-like, as it still reads as machine-generated.

What problems is BERT solving and how is that benefiting you?

BERT helps me debug code and generate ideas for my project; it also wrote a few pieces of code for me. I use it for my daily activities.

Ruchin D.
Senior Research Engineer
Enterprise (> 1000 emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

It's very easy to use, and there are so many resources about it online that anyone can get a very good grasp of it, even without any background knowledge about transformers.

Apart from ease of use, it is also pretrained, and we just need to fine-tune it for our own task.

Fine-tuning is also pretty straightforward, so overall the experience is really nice.
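Fine-tuning proper updates the transformer's weights, but a common lightweight variant of the same workflow is to freeze the pretrained encoder and train only a small classifier head on its embeddings. A stdlib-only sketch of that head; the 2-d vectors below are made-up stand-ins for real BERT [CLS] embeddings:

```python
import math

def train_head(X, y, lr=0.5, epochs=200):
    """Train a logistic-regression 'head' by plain SGD on frozen
    feature vectors (stand-ins for BERT [CLS] embeddings)."""
    w = [0.0] * len(X[0])
    b = 0.0
    for _ in range(epochs):
        for x, t in zip(X, y):
            z = sum(wi * xi for wi, xi in zip(w, x)) + b
            p = 1.0 / (1.0 + math.exp(-z))   # sigmoid
            g = p - t                        # log-loss gradient
            w = [wi - lr * g * xi for wi, xi in zip(w, x)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0

# Made-up 2-d "embeddings" for a binary task (e.g. sentiment).
X = [[2.0, 0.1], [1.8, -0.2], [-1.5, 0.3], [-2.1, -0.1]]
y = [1, 1, 0, 0]
w, b = train_head(X, y)
print([predict(w, b, x) for x in X])  # -> [1, 1, 0, 0]
```

Full fine-tuning usually outperforms a frozen-encoder head, at the cost of the GPU demands several reviewers mention.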

What do you dislike about BERT?

There are only a few things, like it being computationally costly and, like many other transformers, being mostly a black box when we try to understand why it gave certain results.

Also, since we are moving into the age of AI, the token limit in BERT makes its capabilities quite limited.

What problems is BERT solving and how is that benefiting you?

We get a pretrained model which I just have to fine-tune to my own problem statement, and it's really a life-saver in that sense.

Verified User in Internet
Small-Business (50 or fewer emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

- Great for tasks where bidirectional context is required, as opposed to GPT models where the context is unidirectional. Suitable for question-answering, analyzing small paragraphs of words, etc.

- Output is more trustworthy as compared to GPT models.

- Open source

- Easy to fine-tune for domain-specific applications as long as enough data is available.

What do you dislike about BERT?

- It is extremely computationally expensive to build and deploy, especially to produce a quality output.

- Balancing the context window takes a lot of trial and error.

- With the arrival of GPT models, the lack of long context, i.e., limited context, is more noticeable than ever.

- Not suitable for large documents which require broader context.

- (not limited to BERT) A bit of a black box once implemented.

- Not a good choice for tasks where text has to be generated.

What problems is BERT solving and how is that benefiting you?

I've used it for 2 purposes:

1. Figuring out which short passage best answers a question given a bunch of such passages.

2. Analysing a small chunk of a passage to recognize which subject a user is talking about in a very specific domain (required fine-tuning).
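Picking the passage that best answers a question, as in use case 1 above, is often done by embedding the question and the candidate passages and ranking by cosine similarity. A stdlib-only sketch; the 3-d vectors are made-up stand-ins for real BERT embeddings (which are typically 768-d):

```python
import math

def cosine(u, v):
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.sqrt(sum(a * a for a in u)) *
                  math.sqrt(sum(b * b for b in v)))

def best_passage(query_vec, passage_vecs):
    """Index of the passage whose embedding is most similar
    (by cosine) to the query embedding."""
    return max(range(len(passage_vecs)),
               key=lambda i: cosine(query_vec, passage_vecs[i]))

# Made-up 3-d embeddings; a real pipeline would get these from BERT.
query = [0.9, 0.1, 0.0]
passages = [[0.1, 0.9, 0.2], [0.8, 0.2, 0.1], [0.0, 0.1, 0.9]]
print(best_passage(query, passages))  # -> 1
```

In practice the passage embeddings would be precomputed once and the query embedded at request time, which is what keeps this kind of retrieval fast.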

Abhishek K.
Engineer II
Small-Business (50 or fewer emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

It is best suited for the random searches we do on a search engine, where we would otherwise go through multiple pages to build our understanding. With the new BERT engine, it has become very efficient to look up queries and questions, and to seek out other text information.

What do you dislike about BERT?

Sometimes the responses are like a general statement, and we don't get exactly what we are looking for.

What problems is BERT solving and how is that benefiting you?

I've tried to use a similar engine in one of my projects with an LLM, so I took the help of the BERT engine to understand how to optimise PEFT and LoRA.

Zayed R.
Programmer Analyst
Enterprise (> 1000 emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

The easy way to develop NLP-based projects for classification, and the ability to fine-tune the pre-trained model on your own dataset for training and testing the models.

What do you dislike about BERT?

It is good, but when we use large content for classification problems, it consumes a lot of computing power, and this leads to more cost.

What problems is BERT solving and how is that benefiting you?

BERT is used to solve the problem of mail classification and entity extraction: the pre-trained model is further fine-tuned to best fit our dataset for understanding customer mail, which helps automate the responses.
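A mail-classification step like this can be sketched as nearest-centroid classification over sentence embeddings: average the embeddings of labelled example mails per class, then assign a new mail to the closest centroid. Stdlib-only; the class labels and 2-d vectors are hypothetical stand-ins for real BERT embeddings:

```python
def nearest_centroid(train, query):
    """Classify an embedding by the closest per-class centroid
    (squared Euclidean distance)."""
    centroids = {}
    for label, vecs in train.items():
        dim = len(vecs[0])
        centroids[label] = [sum(v[d] for v in vecs) / len(vecs)
                            for d in range(dim)]
    def dist(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v))
    return min(centroids, key=lambda lab: dist(query, centroids[lab]))

# Hypothetical 2-d "mail embeddings" for two categories.
train = {
    "billing": [[1.0, 0.1], [0.9, 0.0]],
    "support": [[0.0, 1.0], [0.1, 0.9]],
}
print(nearest_centroid(train, [0.8, 0.2]))  # -> billing
```

A fine-tuned classification head usually beats centroids on real data; the centroid version is just the cheapest baseline that uses the same embeddings.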

Rakesh K.
BLS, Skill Development
Mid-Market (51-1000 emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

An open-source product by Google. Very easy to implement and work with. It is very flexible to customise for any specific task, which is very helpful for a developer. It helps us in our day-to-day work with NLP.

What do you dislike about BERT?

It takes a lot of time to train the model; hence it is computationally costly and needs high-end machines. High memory consumption is also an issue.

What problems is BERT solving and how is that benefiting you?

We use BERT for our various NLP tasks. It answers queries from our clients and also sends periodic customized emails.

SHUBHAM G.
Data Scientist
Enterprise (> 1000 emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

Regarding BERT, it is the first model I tried for context-based embedding.

The best thing about BERT is that it is simple to understand, retraining and fine-tuning are straightforward, and support for it is available. Also, 3 to 4 generalised English models are available.

What do you dislike about BERT?

Compared to DistilBERT it is heavy in size and bulky in nature, as the same things BERT does can be done with DistilBERT.

What problems is BERT solving and how is that benefiting you?

Using BERT we implemented a custom NER model, which benefits us by identifying personal info in resumes.

Verified User in Information Technology and Services
Small-Business (50 or fewer emp.)
Validated Reviewer
Verified Current User
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

* BERT generates high-quality text by understanding the context around a word. I found good performance on document retrieval and question answering.

* Finetuning BERT on custom data (or transfer learning) is very simple and gives good results. BERT inference is also faster than GPT.

* BERT has an extensive community and good support. Almost everyone around me has used BERT.

What do you dislike about BERT?

In my experience with BERT, I think it still needs improvements:

* I found that BERT fine-tuning does not work well with large-scale datasets (e.g. PILE).

* Its domain knowledge is constricted. It does not know much about domains such as healthcare and education.

Hence, BERT can be considered enough for simple tasks, however, for complex tasks (e.g. open-ended generation, language translation etc.), it needs improvement.

I trust its newer versions will accommodate major fixes. Wish them luck.

What problems is BERT solving and how is that benefiting you?

* Search engine using BERT: retrieving documents similar to the query document.

* Chatbot use case: the utility takes in the "user query" and automatically classifies the business department it should be sent to (i.e. Refund, Feedback, etc.).

* Sentiment classifier for product reviews.

Ojasi K.
AI Engineering Analyst
Enterprise (> 1000 emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

Worked on a use-case for detecting toxicity in prompts and their respective completions. BERT worked effectively, providing us a very high accuracy of up to 92% for correct detections.

What do you dislike about BERT?

Could try including more classes beyond the 6: toxic, severe-toxic, obscene, threat, insult and identity_hate. Some useful recommended classes: gender_bias, ethnicity_bias, etc.

What problems is BERT solving and how is that benefiting you?

We tried to recognize toxic prompts and their respective completions using BERT, and were able to do so with up to 92% accuracy.