A Gradio web application that helps with coding questions using various Hugging Face models:
- GPT-2 (124M) - General purpose model
- OPT (125M) - Good for text generation
- Pythia (160M) - Good for code
- BLOOM (560M) - Multilingual model
- Phi-1 (1.3B) - Good for coding tasks
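For reference, the model names above might map to Hugging Face Hub IDs as in the sketch below. The exact IDs the app passes to the API are an assumption, not confirmed by the repository:

```python
# Hypothetical mapping of the models listed above to Hugging Face Hub IDs.
# The IDs the app actually uses may differ from these.
MODEL_IDS = {
    "GPT-2 (124M)": "gpt2",
    "OPT (125M)": "facebook/opt-125m",
    "Pythia (160M)": "EleutherAI/pythia-160m",
    "BLOOM (560M)": "bigscience/bloom-560m",
    "Phi-1 (1.3B)": "microsoft/phi-1",
}
```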
The app requires a HUGGINGFACE_API_TOKEN environment variable to be set.
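A minimal sketch of how the app might read that token and fail fast with a clear error when it is missing (the function name is illustrative, not the app's actual code):

```python
import os

def get_hf_token() -> str:
    """Read the Hugging Face API token from the environment,
    raising a clear error if it has not been set."""
    token = os.environ.get("HUGGINGFACE_API_TOKEN")
    if not token:
        raise RuntimeError(
            "HUGGINGFACE_API_TOKEN is not set; "
            "export it before launching the app."
        )
    return token
```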
- Interactive Interface: Users can input prompts and receive code-related responses in real-time.
- Model Integration: Utilizes the CodeLlama2 model for generating code suggestions and explanations.
- History Tracking: Keeps track of user prompts to provide contextually relevant responses.
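The history tracking could be implemented along the lines of the sketch below, which keeps the most recent prompts and prepends them as context for the next request. The class name and truncation policy are assumptions, not the app's actual implementation:

```python
class PromptHistory:
    """Minimal sketch of prompt history: store recent prompts and
    prepend them as context when building the next request."""

    def __init__(self, max_items: int = 5):
        self.max_items = max_items
        self.items: list[str] = []

    def add(self, prompt: str) -> None:
        self.items.append(prompt)
        # Keep only the most recent prompts to bound context length.
        self.items = self.items[-self.max_items:]

    def build_context(self, new_prompt: str) -> str:
        history = "\n".join(f"Previous: {p}" for p in self.items)
        return f"{history}\nCurrent: {new_prompt}" if history else new_prompt
```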
To run this project, you need to have the following Python packages installed:
- langchain
- gradio
You can install the required packages using pip:
pip install langchain gradio
- Clone the repository:
  git clone https://github.com/KR-16/Code-Assistant-CodeLlama2.git
  cd Code-Assistant-CodeLlama2
- Start the application:
  python app.py
- Open your web browser and navigate to http://localhost:7860 to access the interface.
This project is licensed under the Apache License 2.0. See the LICENSE file for more details.
Contributions are welcome! Please feel free to submit a pull request or open an issue for any suggestions or improvements.
- Thanks to the developers of CodeLlama2 for providing the model that powers this application.