# Refine

Create and refine an answer by sequentially going through each retrieved text chunk: the first chunk produces an initial answer, and each subsequent chunk is used to refine it.

**Pros**: Good for producing more detailed answers

**Cons**: Makes a separate LLM call per Node, so it can be slow and expensive

<figure><img src="https://662370747-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2F3VoWwSsyrEg0DEvIIjv9%2Fuploads%2Fgit-blob-fc10c2c7665417624a77dd70a9e2ae43eb21ff1a%2Fimage%20(5)%20(1)%20(1)%20(1)%20(1)%20(2)%20(1).png?alt=media" alt=""><figcaption></figcaption></figure>

**Refine Prompt**

```markup
The original query is as follows: {query}
We have provided an existing answer: {existingAnswer}
We have the opportunity to refine the existing answer (only if needed) with some more context below.
------------
{context}
------------
Given the new context, refine the original answer to better answer the query. If the context isn't useful, return the original answer.
Refined Answer:
```

**Text QA Prompt**

```markup
Context information is below.
---------------------
{context}
---------------------
Given the context information and not prior knowledge, answer the query.
Query: {query}
Answer:
```
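The flow described above can be sketched as a loop: answer the query from the first chunk with the Text QA prompt, then make one Refine call per remaining chunk. This is a minimal illustration, not the library's actual implementation; the function and type names (`refineAnswer`, `CompleteFn`) are hypothetical, and only the two prompt templates are taken from this page.

```typescript
// Signature of an LLM completion call (hypothetical stand-in for a real client).
type CompleteFn = (prompt: string) => string;

// Text QA prompt: used once, for the first chunk.
const textQaPrompt = (context: string, query: string): string =>
  `Context information is below.\n` +
  `---------------------\n${context}\n---------------------\n` +
  `Given the context information and not prior knowledge, answer the query.\n` +
  `Query: ${query}\nAnswer:`;

// Refine prompt: used for every subsequent chunk, carrying the existing answer.
const refinePrompt = (query: string, existingAnswer: string, context: string): string =>
  `The original query is as follows: ${query}\n` +
  `We have provided an existing answer: ${existingAnswer}\n` +
  `We have the opportunity to refine the existing answer (only if needed) with some more context below.\n` +
  `------------\n${context}\n------------\n` +
  `Given the new context, refine the original answer to better answer the query. ` +
  `If the context isn't useful, return the original answer.\nRefined Answer:`;

// One LLM call per chunk: 1 Text QA call + (n - 1) Refine calls.
function refineAnswer(query: string, chunks: string[], complete: CompleteFn): string {
  if (chunks.length === 0) throw new Error("need at least one retrieved chunk");
  let answer = complete(textQaPrompt(chunks[0], query));
  for (const chunk of chunks.slice(1)) {
    answer = complete(refinePrompt(query, answer, chunk));
  }
  return answer;
}
```

Note how the cost grows linearly with the number of retrieved Nodes, which is the trade-off listed under **Cons** above.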
