

54 BERT Reviews

4.4 out of 5

BERT Pros and Cons

How are these determined?
Pros and Cons are compiled from review feedback and grouped into themes to provide an easy-to-understand summary of user reviews.
APOORV G.
Software Engineer
Small-Business (50 or fewer emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

Its ability to capture contextual nuances in language is outstanding, allowing for more accurate and context-aware natural language understanding. Its bidirectional approach and pre-training on extensive datasets contribute to its versatility across a spectrum of NLP tasks, making it a powerful tool in the field.

What do you dislike about BERT?

Its computational intensity: it requires substantial resources for training and inference. It can also struggle with out-of-vocabulary words and might not handle long-range dependencies as effectively. Despite these limitations, ongoing research and advancements aim to address and mitigate these challenges in future models.

Bittu M.
Technical Assistant
Small-Business (50 or fewer emp.)
Validated Reviewer
Verified Current User
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

I have been using BERT for the last 3 months now. It gives precise, to-the-point answers for my daily activities, and as a chatbot it gives completely relevant information, like a mentor available 24/7. I highly recommend it to everyone; I'm saving lots of time and effort using BERT.

What do you dislike about BERT?

About the interface: as a Google product, it should look classier. The information could also be made more human-like; even now it reads as machine-generated.

Ruchin D.
Senior Research Engineer
Enterprise (> 1000 emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

It's very easy to use, and there are so many resources about it online that anyone can get a very good grasp of it, even without any background knowledge about transformers.

Apart from ease of use, it is also pretrained, so we just need to fine-tune it for our own task.

Fine-tuning is also pretty straightforward (a minimal sketch follows below), so overall the experience is really nice.
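A minimal sketch of the fine-tuning workflow the reviewer describes, assuming the Hugging Face transformers library and a toy two-class task (both assumptions; the review names no toolkit or dataset):

```python
# Minimal BERT fine-tuning sketch: pretrained body, fresh classification head.
# Toy data and the transformers/Trainer API are assumptions for illustration.
import torch
from transformers import (BertTokenizerFast, BertForSequenceClassification,
                          Trainer, TrainingArguments)

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)  # pretrained weights carry over

# Placeholder data; swap in your own texts and labels.
texts, labels = ["great product", "terrible experience"], [1, 0]
enc = tokenizer(texts, truncation=True, padding=True)

class ToyDataset(torch.utils.data.Dataset):
    def __init__(self, encodings, labels):
        self.encodings, self.labels = encodings, labels
    def __len__(self):
        return len(self.labels)
    def __getitem__(self, i):
        item = {k: torch.tensor(v[i]) for k, v in self.encodings.items()}
        item["labels"] = torch.tensor(self.labels[i])
        return item

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="bert-finetuned", num_train_epochs=1),
    train_dataset=ToyDataset(enc, labels),
)
trainer.train()  # only the task head is new; everything else is pretrained
```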

What do you dislike about BERT?

There are only a few things, like it being computationally costly, and, like many other transformers, it's mostly a black box when we try to understand why it gave certain results.

Also, since we are moving into the age of AI, the token limitation in BERT (illustrated below) actually makes its capabilities very limited.
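The token limitation is easy to see with the tokenizer itself (a short sketch, assuming bert-base-uncased via Hugging Face transformers):

```python
# bert-base-uncased caps inputs at 512 tokens, so longer text must be
# truncated or chunked before the model can see it.
from transformers import BertTokenizerFast

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
print(tokenizer.model_max_length)  # 512

long_text = "word " * 1000  # far more than the model can accept
ids = tokenizer(long_text, truncation=True)["input_ids"]
print(len(ids))  # 512: everything past the cap is dropped
```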

Verified User in Internet
Small-Business (50 or fewer emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

- Great for tasks where bidirectional context is required, as opposed to GPT models, where the context is unidirectional (see the sketch after this list). Suitable for question answering, analyzing small paragraphs of text, etc.

- Output is more trustworthy as compared to GPT models.

- Open source

- Easy to fine-tune for domain-specific applications as long as enough data is available.
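A quick way to see the bidirectional context mentioned in the first point is BERT's masked-language head, which uses words on both sides of the mask (a sketch assuming the Hugging Face pipeline API):

```python
# The masked-language head predicts [MASK] from BOTH directions: here the
# left context is only "The", so the prediction relies on what follows,
# which a purely left-to-right (GPT-style) model could not use.
from transformers import pipeline

fill = pipeline("fill-mask", model="bert-base-uncased")
for pred in fill("The [MASK] overflowed its banks after heavy rain."):
    print(pred["token_str"], round(pred["score"], 3))
# Tokens like "river" typically score highly purely from the right-hand
# context ("overflowed its banks", "rain").
```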

What do you dislike about BERT?

- It is extremely computationally expensive to build and deploy, especially to produce a quality output.

- Balancing the context window takes a lot of trial and error.

- With the arrival of GPT models, the lack of long context, i.e., limited context, is more noticeable than ever.

- Not suitable for large documents which require broader context.

- (not limited to BERT) A bit of a black box once implemented.

- Not a good choice for tasks where text has to be generated.

Abhishek K.
Engineer II
Small-Business (50 or fewer emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

It is best suited for the random searches we do on a search engine, where we used to go through multiple pages to build our understanding. With the new BERT-powered engine, it has become much more efficient to look up queries and questions, as well as to seek other text information.

What do you dislike about BERT?

Sometimes the responses are like a general statement, and we don't get exactly what we are looking for.

Zayed R.
Programmer Analyst
Enterprise (> 1000 emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

It makes it easy to develop NLP-based projects for classification. The pre-trained model can be fine-tuned on our own dataset for training and testing.

What do you dislike about BERT?

It is good, but when we use large content for classification problems, it consumes a lot of computing power, and this leads to higher cost.

Rakesh K.
BLS, Skill Development
Mid-Market (51-1000 emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

An open-source product by Google. Very easy to implement and work with. It is very flexible to customise for any specific task, which is very helpful for a developer. It helps us in our day-to-day work with NLP.

What do you dislike about BERT?

It takes a lot of time to train the model; hence it is computationally costly and needs high-end machines. High memory consumption is also an issue.

SHUBHAM G.
Data Scientist
Enterprise (> 1000 emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

Regarding BERT, it is the first model I tried for context-based embedding (a small sketch follows below).

The best thing about BERT is that it is simple to understand, retraining and fine-tuning it are well supported, and 3 to 4 generalised English models are available.
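A small sketch of the context-based embedding use mentioned here (assuming bert-base-uncased via Hugging Face transformers; the review names no specific setup). The point is that the same word gets a different vector in different sentences:

```python
# Contextual embeddings: the vector for "bank" depends on its sentence.
import torch
from transformers import BertTokenizerFast, BertModel

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased").eval()

def word_vector(sentence, word):
    enc = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**enc).last_hidden_state[0]   # (seq_len, 768)
    # locate the target word's token (single-wordpiece words only)
    idx = enc.input_ids[0].tolist().index(tokenizer.convert_tokens_to_ids(word))
    return hidden[idx]

v1 = word_vector("i sat on the river bank.", "bank")
v2 = word_vector("i deposited cash at the bank.", "bank")
print(torch.cosine_similarity(v1, v2, dim=0).item())  # well below 1.0
```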

What do you dislike about BERT?

Compared to DistilBERT, it is heavy in size and bulky in nature, since much of what BERT does can also be done with DistilBERT.
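The size difference behind this point is easy to check (a sketch assuming the standard Hugging Face Hub checkpoints for both models):

```python
# Parameter counts for the "heavy vs. light" comparison: DistilBERT keeps
# most of BERT's quality at roughly 60% of the size.
from transformers import AutoModel

for name in ("bert-base-uncased", "distilbert-base-uncased"):
    n = sum(p.numel() for p in AutoModel.from_pretrained(name).parameters())
    print(f"{name}: {n / 1e6:.0f}M parameters")
# bert-base-uncased: ~110M, distilbert-base-uncased: ~66M
```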

Verified User in Information Technology and Services
Small-Business (50 or fewer emp.)
Validated Reviewer
Verified Current User
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

* BERT generates high-quality text by understanding the context around a word. I found good performance on document retrieval and question answering (see the sketch after this list).

* Fine-tuning BERT on custom data (or transfer learning) is very simple and gives good results. BERT inference is also faster than GPT.

* BERT has an extensive community and good support. Almost everyone around me has used BERT.
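For the question-answering use in the first point, a typical setup loads a BERT checkpoint already fine-tuned on SQuAD (a sketch; the stock Google checkpoint below is an assumption, since the reviewer names none):

```python
# Extractive question answering with a SQuAD-fine-tuned BERT checkpoint:
# the model selects an answer span from the context, it does not generate
# free text.
from transformers import pipeline

qa = pipeline("question-answering",
              model="bert-large-uncased-whole-word-masking-finetuned-squad")
out = qa(question="What does BERT stand for?",
         context="BERT, or Bidirectional Encoder Representations from "
                 "Transformers, is a language model released by Google in 2018.")
print(out["answer"], round(out["score"], 3))
```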

What do you dislike about BERT?

In my experience with BERT, I think it still needs improvements:

* I found that BERT fine-tuning does not work well with large-scale datasets (e.g., the Pile).

* Its domain knowledge is constricted. It does not know much about domains such as healthcare and education.

Hence, BERT can be considered sufficient for simple tasks; however, for complex tasks (e.g. open-ended generation, language translation, etc.), it needs improvement.

I trust its newer versions will include major fixes. I wish them luck.

Ojasi K.
AI Engineering Analyst
Enterprise (> 1000 emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

We worked on a use case for detecting toxicity in prompts and their respective completions. BERT worked effectively, giving us a very high accuracy of up to 92% for correct detections.

What do you dislike about BERT?

It could try including more classes beyond the six (toxic, severe_toxic, obscene, threat, insult, and identity_hate). Some useful recommended classes: gender_bias, ethnicity_bias, etc. (A sketch of this multi-label setup follows below.)
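The six classes named match a common multi-label setup; a hedged sketch of how extra classes such as gender_bias could be added (the reviewer shares no code, so the model head below is illustrative):

```python
# Multi-label toxicity head: one sigmoid score per class, so adding classes
# such as gender_bias just means widening the label list and retraining.
# (Untrained head shown; scores are meaningless until fine-tuned.)
import torch
from transformers import BertTokenizerFast, BertForSequenceClassification

LABELS = ["toxic", "severe_toxic", "obscene", "threat", "insult",
          "identity_hate"]  # + e.g. "gender_bias", "ethnicity_bias"

tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased",
    num_labels=len(LABELS),
    problem_type="multi_label_classification",  # BCE loss, sigmoid outputs
)

enc = tokenizer("prompt or completion to score", return_tensors="pt")
with torch.no_grad():
    scores = torch.sigmoid(model(**enc).logits)[0]
for label, s in zip(LABELS, scores):
    print(f"{label}: {s:.2f}")
```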