Updating the CUDA instruction

Added example code to use GPU (CUDA) instead of CPU.

README.md (CHANGED)

````diff
@@ -241,6 +241,15 @@ from nemo.collections.speechlm2.models import SALM
 model = SALM.from_pretrained('nvidia/canary-qwen-2.5b')
 ```
 
+```python
+# To run with CUDA
+from nemo.collections.speechlm2.models import SALM
+import torch
+device = torch.device("cuda")
+
+model = SALM.from_pretrained('nvidia/canary-qwen-2.5b').bfloat16().eval().to(device)
+```
+
 ## Input:
 
 **Input Type(s):** Audio, text prompt <br>
````
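As a usage note: the added snippet assumes a CUDA-capable GPU is available and will raise an error on CPU-only machines. A minimal device-selection sketch (my addition for illustration, not part of this PR) could guard against that with `torch.cuda.is_available()`:

```python
import torch

# Fall back to CPU when no CUDA device is present (hypothetical guard,
# not part of the PR's snippet)
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# The model would then be moved with .to(device) as in the README example
print(device.type)
```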