fabiochiu committed b69a5e3 · 1 Parent(s): 7b6462c

Update README.md

Files changed (1): README.md (+58 -0)
README.md CHANGED
@@ -13,6 +13,64 @@ probably proofread and complete it, then remove this comment. -->

  This model is [t5-base](https://huggingface.co/t5-base) fine-tuned on the [190k Medium Articles](https://www.kaggle.com/datasets/fabiochiusano/medium-articles) dataset for predicting article titles using the article textual content as input.

+ ## How to use the model
+
+ ```
+ from transformers import AutoTokenizer, AutoModelForSeq2SeqLM
+ import nltk
+ nltk.download('punkt')  # sentence tokenizer used to trim the generated text to its first sentence
+
+ tokenizer = AutoTokenizer.from_pretrained("fabiochiu/t5-small-medium-title-generation")
+ model = AutoModelForSeq2SeqLM.from_pretrained("fabiochiu/t5-small-medium-title-generation")
+
+ text = """
+ Many financial institutions started building conversational AI, prior to the Covid19
+ pandemic, as part of a digital transformation initiative. These initial solutions
+ were high profile, highly personalized virtual assistants — like the Erica chatbot
+ from Bank of America. As the pandemic hit, the need changed as contact centers were
+ under increased pressures. As Cathal McGloin of ServisBOT explains in “how it started,
+ and how it is going,” financial institutions were looking for ways to automate
+ solutions to help get back to “normal” levels of customer service. This resulted
+ in a change from the “future of conversational AI” to a real tactical assistant
+ that can help in customer service. Haritha Dev of Wells Fargo, saw a similar trend.
+ Banks were originally looking to conversational AI as part of digital transformation
+ to keep up with the times. However, with the pandemic, it has been more about
+ customer retention and customer satisfaction. In addition, new use cases came about
+ as a result of Covid-19 that accelerated adoption of conversational AI. As Vinita
+ Kumar of Deloitte points out, banks were dealing with an influx of calls about new
+ concerns, like questions around the Paycheck Protection Program (PPP) loans. This
+ resulted in an increase in volume, without enough agents to assist customers, and
+ tipped the scale to incorporate conversational AI. When choosing initial use cases
+ to support, financial institutions often start with high volume, low complexity
+ tasks. For example, password resets, checking account balances, or checking the
+ status of a transaction, as Vinita points out. From there, the use cases can evolve
+ as the banks get more mature in developing conversational AI, and as the customers
+ become more engaged with the solutions. Cathal indicates another good way for banks
+ to start is looking at use cases that are a pain point, and also do not require a
+ lot of IT support. Some financial institutions may have a multi-year technology
+ roadmap, which can make it harder to get a new service started. A simple chatbot
+ for document collection in an onboarding process can result in high engagement,
+ and a high return on investment. For example, Cathal has a banking customer that
+ implemented a chatbot to capture a driver’s license to be used in the verification
+ process of adding an additional user to an account — it has over 85% engagement
+ with high satisfaction. An interesting use case Haritha discovered involved
+ educating customers on financial matters. People feel more comfortable asking a
+ chatbot what might be considered a “dumb” question, as the chatbot is less judgmental.
+ Users can be more ambiguous with their questions as well, not knowing the right
+ words to use, as chatbot can help narrow things down.
+ """
+
+ inputs = ["summarize: " + text]
+
+ max_input_length = 512  # cap the article at 512 tokens before feeding it to the model
+ inputs = tokenizer(inputs, max_length=max_input_length, truncation=True, return_tensors="pt")
+ output = model.generate(**inputs, num_beams=8, do_sample=True, min_length=10, max_length=64)
+ decoded_output = tokenizer.batch_decode(output, skip_special_tokens=True)[0]
+ predicted_title = nltk.sent_tokenize(decoded_output.strip())[0]  # keep only the first generated sentence
+
+ print(predicted_title)
+ # Conversational AI: The Future of Customer Service
+ ```
+
  ## Training and evaluation data

  The model was trained for a single epoch on about 16,000 articles and evaluated on 1,000 random articles held out from training.
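
A setup along these lines could reproduce that split. This is a minimal sketch, not the actual training script: it assumes the Kaggle dump is a local `medium_articles.csv` with `text` and `title` columns, and the sequence lengths, batch size, and learning rate are illustrative choices rather than the values used for this checkpoint.

```
# Fine-tuning sketch: 16,000 training articles, 1,000 held out for evaluation,
# one epoch. Assumes a local medium_articles.csv with "text" and "title" columns.
from datasets import load_dataset
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("t5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("t5-base")

# Sample ~17k articles from the 190k dump and hold out 1,000 for evaluation.
dataset = load_dataset("csv", data_files="medium_articles.csv")["train"]
dataset = dataset.shuffle(seed=42).select(range(17_000))
splits = dataset.train_test_split(test_size=1_000, seed=42)

def preprocess(batch):
    # T5 is text-to-text: the prefixed article body is the input,
    # the article title is the target sequence.
    model_inputs = tokenizer(
        ["summarize: " + t for t in batch["text"]],
        max_length=512, truncation=True,
    )
    labels = tokenizer(text_target=batch["title"], max_length=64, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = splits.map(preprocess, batched=True, remove_columns=splits["train"].column_names)

args = Seq2SeqTrainingArguments(
    output_dir="t5-title-generation",
    num_train_epochs=1,                # single epoch, as described above
    per_device_train_batch_size=8,     # illustrative
    learning_rate=5e-5,                # illustrative
    evaluation_strategy="epoch",
    predict_with_generate=True,
)

trainer = Seq2SeqTrainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["test"],
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```

The card only states the split sizes and the single epoch; every other hyperparameter above is an assumption.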