

54 BERT Reviews

4.4 out of 5

BERT Pros and Cons

How are these determined?
Pros and Cons are compiled from review feedback and grouped into themes to provide an easy-to-understand summary of user reviews.
YP
Software Testing Expert
Information Technology and Services
Mid-Market(51-1000 emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

We are using BERT for personalized marketing campaigns to address customer concerns and questions about goods and services, which improves customer service.

It lets us send customers emails with content pertinent to their interests and target them with ads for products they are likely to be interested in. Review collected by and hosted on G2.com.

What do you dislike about BERT?

I don't think there are major drawbacks, but at first I encountered a few issues. BERT is a black-box model, which makes it difficult to understand how it generates its predictions.

The computing cost was high for both training and deployment.

Bias can also be an issue, since BERT is trained on a very large dataset.

Saptarshi B.
Data Scientist
Enterprise(> 1000 emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

Very easy to implement, and it can be integrated with any type of downstream task. The Hugging Face implementation offers a lot of support and is very well documented. Several additional features can also be fed into the network with an intelligent architectural design.
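As a toy illustration of the downstream-task pattern this reviewer describes: a classification head is just a small learned layer placed on top of the sentence embedding the encoder produces. Everything below (the embedding values, weights, and the `classify` helper) is made up for illustration; a real pipeline would use the Hugging Face `transformers` library rather than this sketch.

```python
# Toy sketch: a downstream classification head on top of a (fake) BERT
# sentence embedding. Only illustrates the idea of "plugging BERT into
# a downstream task"; not the actual Hugging Face API.

def classify(cls_embedding, weights, bias):
    """Linear head: score = w . h + b, label = sign of the score."""
    score = sum(w * h for w, h in zip(weights, cls_embedding)) + bias
    return "positive" if score >= 0 else "negative"

# Pretend 4-dimensional [CLS] embedding produced by the encoder.
embedding = [0.5, -0.2, 0.1, 0.9]
weights = [1.0, 0.5, -0.3, 0.2]   # would be learned during fine-tuning
bias = -0.1

label = classify(embedding, weights, bias)
print(label)
```

The same pattern generalises: swapping the head (classifier, span predictor, tagger) is what lets one pretrained encoder serve many downstream tasks.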

What do you dislike about BERT?

Sometimes it becomes very hard to comprehend BERT's output, and it loses interpretability.

PR
Senior Software Developer
Information Technology and Services
Mid-Market(51-1000 emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

We use BERT on-premises. It is open-sourced by Google, comes with great features, and allows us as developers to build upon and adapt the model for our specific needs.

It is designed to understand the context of a word in a sentence through bidirectional modelling, and it improved our NLP tasks like question answering and sentiment analysis.

The pretrained model is adaptable and useful for a variety of applications because it can be fine-tuned on a particular task with only modest amounts of task-specific data.
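The "fine-tune with modest data" idea this reviewer mentions can be sketched in miniature: train only a small head on a handful of labelled examples while the big pretrained encoder stays fixed. The data, dimensions, and perceptron-style update below are all invented for illustration; real fine-tuning updates BERT's own weights with gradient descent.

```python
# Toy sketch of "fine-tuning" just a classification head on a handful
# of labelled examples (perceptron-style updates). The pretrained
# encoder is imagined as frozen; only the head weights w change.

def train_head(examples, dims, epochs=10):
    w = [0.0] * dims
    for _ in range(epochs):
        for x, y in examples:                 # y is +1 or -1
            score = sum(wi * xi for wi, xi in zip(w, x))
            if y * score <= 0:                # misclassified: nudge w toward x
                w = [wi + y * xi for wi, xi in zip(w, x)]
    return w

# Pretend 2-d sentence embeddings with sentiment labels.
data = [([1.0, 0.2], +1), ([-0.8, 0.1], -1), ([0.9, -0.1], +1)]
w = train_head(data, dims=2)
print(w)
```

Even this toy converges on three examples, which mirrors the reviewer's point: the heavy lifting was done in pretraining, so the task-specific part needs comparatively little data.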

What do you dislike about BERT?

Its complexity can require significant computational power, limiting accessibility for smaller projects. Additionally, while it excels at understanding word relationships, it sometimes struggles with longer documents where broader context is needed.

SS
Senior Content Editor
Information Technology and Services
Mid-Market(51-1000 emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

BERT strikes me as a very potent and adaptable NLP model. I appreciate how it can learn more complex representations of meaning by taking into account both the left and right context of a word. Its flexibility and ability to be customised for particular tasks are both appealing to me.
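The left-and-right context point can be illustrated with a small sketch (plain Python, not the actual model): a left-to-right language model only sees the words before a position, while a bidirectional encoder conditions on both sides of the word at once.

```python
# Toy illustration of unidirectional vs. bidirectional context.
# A causal (left-to-right) model sees only earlier tokens; a
# bidirectional encoder like BERT attends to both sides at once.

def causal_context(tokens, i):
    return tokens[:i]                      # words before position i only

def bidirectional_context(tokens, i):
    return tokens[:i] + tokens[i + 1:]     # everything except the word itself

tokens = ["the", "bank", "of", "the", "river"]
print(causal_context(tokens, 1))          # ['the']
print(bidirectional_context(tokens, 1))   # ['the', 'of', 'the', 'river']
```

For disambiguating "bank" here, the right-hand context ("river") is exactly what a left-to-right model would not yet have seen.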

What do you dislike about BERT?

It takes a lot of computational power to train and use, which was initially a challenge for us.

It is trained on a sizeable text dataset that might be biased. The accuracy of predictions may be sensitive to biases in the training data for tasks like natural language inference and question answering.

Sandeep G.
AI Developer
Mid-Market(51-1000 emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

BERT is a Transformer model with an encoder architecture. I have worked with different Transformer architectures, but the BERT architecture performed well compared to the others. It is easy to train with custom data and easy to understand.

What do you dislike about BERT?

The BERT model is computationally costly and the model size is very large. It takes a lot of time for inference and training on a CPU machine.

NG
Software Developer
Information Technology and Services
Small-Business(50 or fewer emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

It is open source. It works with languages like Hindi, Spanish, Arabic, Portuguese, German, etc. It is used in the Google search engine for better results. It reads sentences from both directions and identifies relations between words very well. It is based on Python, which is very easy.

What do you dislike about BERT?

Higher memory requirements. It is not updated or enhanced very frequently.

apurv t.
Associate Planning and Scheduling
Small-Business(50 or fewer emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

It is an open-source LLM, already trained on a large corpus of text.

What do you dislike about BERT?

It takes time to train the model. It is trained on general data, so if you are using it for a specific purpose you might have to use another LLM.

Abhay S.
Talent Solutions Specialist
Small-Business(50 or fewer emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

BERT is good at understanding the meaning of sentences, be they complex, simple, or ambiguous. It has a variety of use cases, like question answering and understanding the sentiment of a phrase (positive, negative, or neutral). It is open source, so we can use it in our own application and improve our offerings.

What do you dislike about BERT?

BERT has sometimes been wrong in interpreting regional languages or a mixture of multiple languages in one phrase. Its predictions have been wrong many times, which could be improved for general text.

SM
Mid-Market(51-1000 emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

This tool understands my search intent in almost any language I use because it is trained on 100+ languages. It also helps me with longer, more conversational queries by better understanding word and phrase context.

What do you dislike about BERT?

I find this tool fails at applying knowledge or common sense, as it just works on patterns in text data. It is hard to deploy large BERT models. The model can memorise sensitive training data, which raises privacy concerns.

Modalavalasa S.
Assistant System Engineer
Small-Business(50 or fewer emp.)
Validated Reviewer
Review source: G2 invite
Incentivized Review
What do you like best about BERT?

BERT (Bidirectional Encoder Representations from Transformers) achieved state-of-the-art performance on a number of natural language understanding tasks such as text classification, summarization, and machine translation. It was pre-trained simultaneously on two tasks, which is amazing. As it is open source, it helped me a lot with my projects.
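For context, the two pre-training tasks are masked language modelling (MLM) and next-sentence prediction. The MLM input preparation can be sketched in plain Python; this is a deterministic toy that masks fixed positions, whereas real BERT masks roughly 15% of tokens at random.

```python
# Toy sketch of BERT's masked-language-modelling input preparation:
# chosen tokens are replaced with [MASK], and the model is trained
# to predict the originals at those positions.

def mask_tokens(tokens, positions):
    masked = list(tokens)
    targets = {}
    for i in positions:
        targets[i] = masked[i]   # the label the model must recover
        masked[i] = "[MASK]"
    return masked, targets

tokens = ["the", "cat", "sat", "on", "the", "mat"]
masked, targets = mask_tokens(tokens, [1, 5])
print(masked)    # ['the', '[MASK]', 'sat', 'on', 'the', '[MASK]']
print(targets)   # {1: 'cat', 5: 'mat'}
```

Because the model must fill each blank using words on both sides of it, this objective is what forces the bidirectional representations the reviews praise.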

What do you dislike about BERT?

It needs improvement, as it answers based only on the words it was trained on. It requires high computational resources and fine-tuning on labelled data. Customer support needs to be improved to increase the usage of BERT.