Releases: pwgit-create/APPWISH_OLLAMA
AppWish 2.1.1
News
Made a minor fix. If you need more information, check out this PR.
For instructions on starting AppWish or turning the external-libraries function on, refer to the previous release notes.
If you need help getting the AppWish AMD x64 edition running on WSL on your machine, refer to this thread in the discussion forum:
If you have a decent Nvidia card, I suggest WSL over a VM: WSL will use your GPU, which makes Ollama faster.
Commands for Appwish2
To view the commands that are available for this release, click here.
AppWish 2.1
News
• You can now place external libraries (jars) in a designated folder in order to compile more advanced applications. For security, this feature is off by default.
Please note that you need to set USE_EXTERNAL_LIBRARIES from off to on to enable external libraries.
Ollama-config
USE_EXTERNAL_LIBRARIES=on
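As a sketch only, flipping the flag can look like this. The temp file below is a stand-in for the real config, which you would normally open via the ollama-config alias:

```shell
# Sketch: enable external libraries by rewriting the config line.
cfg="$(mktemp)"                          # stand-in for the real Ollama config file
printf 'USE_EXTERNAL_LIBRARIES=off\n' > "$cfg"

# Rewrite the line so the value becomes "on"
sed -i 's/^USE_EXTERNAL_LIBRARIES=.*/USE_EXTERNAL_LIBRARIES=on/' "$cfg"
cat "$cfg"
```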
• The installation script has been improved/fixed.
• When continuing an existing application, the AI output should contain fewer references to code comments.
Install AppWish Ollama
./install_ollama.sh
Start Appwish Ollama
appwish2
Linux AMD x64
Model: codestral:22b-v0.1-q5_0
Java: If Java 19 isn't already installed, it will be installed using apt.
Raspberry Pi 5
Model: gemma2:2b
Java: JDK21
Comment: Java 19 is not available from apt on Raspberry Pi OS, so we use bellsoft-jdk21.0.4+9-linux-aarch64.deb instead.
Java 19 is needed for the jar to run, but you can create code that runs on Java 21 (such as KEM apps).
Considerations regarding the Raspberry Pi 5 version:
The model we use is not as good as Codestral for code base generation, but it's awesome for generating single apps. I can run it flawlessly on a Raspberry Pi 5 with 8 GB of RAM :) Don't expect results as good as the AMD version running on WSL on a decent PC though 😉
Proper cooling:
Please review this video and post about cooling tips for the Raspberry Pi 5. Proper cooling is essential since you will be running your own AI server on the device. Running your own AI server is advantageous because you can generate your code offline, and setup is easy: the install script installs everything for you.
There are default external libraries included
To use external libraries, ensure they are turned on (see the text at the top of this release).
Three libraries are included by default. Their licenses can be viewed on the Maven repository:
https://mvnrepository.com/artifact/com.google.code.gson/gson/2.4
https://mvnrepository.com/artifact/org.json/json
https://mvnrepository.com/artifact/org.seleniumhq.selenium/selenium-java
The following is always included in the installation script, whether for the ARM or x64 edition:
- Ollama
- curl (required for installing Ollama)
Note: If Java or curl is already on your system, the install script won't reinstall it. Running the install script again also updates your Ollama version; this is safe even if Ollama or AppWish is already installed.
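The idempotent behavior described in the note can be sketched in shell. This is an illustration only; `need_install` is a hypothetical helper name, not part of the actual install script:

```shell
# Sketch of the kind of check the install script performs: a tool is
# only installed when it is missing, so re-running the script is safe.
# need_install is a hypothetical helper, not from the real script.
need_install() {
  # true (exit 0) when the command is NOT already on the system
  ! command -v "$1" >/dev/null 2>&1
}

if need_install curl; then
  echo "would run: sudo apt-get install -y curl"  # curl is needed to fetch Ollama
else
  echo "curl already present, skipping"
fi
```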
What's Changed
- Sync branches by @pwgit-create in #191
- External libs by @pwgit-create in #192
- Fixed running apps with new classpath and removed unused imports by @pwgit-create in #193
- Fix existing code issue by @pwgit-create in #194
- Version 2.1 by @pwgit-create in #195
Full Changelog: v2.0.5...v2.1
AppWish 2.0.5
News
- Redesign to a minimalistic GUI that looks more professional.
- CodeGeex4 model now has a fix for the class order during code base generation (For further details, refer to #184).
- You can find pictures of the new release in the discussion forum.
Install AppWish Ollama
./install_ollama.sh
Start Appwish Ollama
appwish2
The install script for Linux x64 version includes
Model: codestral:22b-v0.1-q5_0
Java: If Java 19 isn't already installed, it will be installed using apt.
Raspberry Pi 5 installation script includes
Model: codegeex4:latest
Java: If JDK 21 isn't already installed, it will be installed using dpkg. As a result, those unfamiliar with Java no longer have to set up Java 19+ manually.
Comment: We use bellsoft-jdk21.0.4+9-linux-aarch64.deb since Java 19 is not available from apt on the Raspberry OS right now.
Java 19 is the only requirement for running the jar, but you can create code that runs on Java 21 (such as KEM apps).
Always included, whether for the ARM or x64 edition:
- Ollama (if Ollama is already installed but a newer version exists, it will be downloaded for you)
- curl (if not already installed; it's required for installing Ollama)
If you've previously installed AppWish, you'll need to run the installation script again.
Run the installation script for every new AppWish version. Doing so sets the aliases to the correct path, updates Ollama, and ensures the AI model used by AppWish is downloaded to your system.
Details about the Raspberry Pi 5 version:
- The codegeex4:latest model is useful for creating and modifying individual apps, but it is not suitable for generating code bases on this version of AppWish. On a Raspberry Pi 5 (8 GB RAM), it runs without any issues and is actually quite good!
- To prevent heat from building up while running AppWish2 on your device, remove the lid. This is of paramount importance!
If you're unsure how to optimize the temperature of your Raspberry Pi, please refer to the video and post on my LinkedIn.
🐲 🔮 🌌
Commands / Aliases
- appwish2
- change-model
- ollama-config
- appwish-applications
- appwish-version
- appwish-path
Helper script for Windows Subsystem for Linux (WSL)
This script is designed to help you run Appwish Ollama using WSL (wsl_helper_script.sh).
Running this script is not recommended if you have no intention of using WSL.
A message from pwgit-create
Please feel free to contact me at snow_900@outlook.com if you have any questions.
Remember to have a good time and think outside the box!
What's Changed
- Gui branch by @pwgit-create in #183
- Fixed so the codebase generations with codegeex4 ai model will work b… by @pwgit-create in #184
- Im branch by @pwgit-create in #186
- sync develop branch by @pwgit-create in #188
- Appwish Enterprise 2.0.5. PR to Master Branch by @pwgit-create in #187
Full Changelog: v2.0.4...v2.0.5
AppWish 2.0.4
News
- The Ollama4j dependency in AppWish has been updated to version 1.0.79. This version offers more customization options for requests to the Ollama API, including the ability to disable the template format for the query, plus various optimizations. I am grateful to a great developer, amithkoujalgi, for his dedication and hard work in developing and maintaining Ollama4j.
- A minor fix has been made to the GUI.
- The release once again includes Raspberry Pi 5. 💫
Install AppWish Ollama
./install_ollama.sh
Start Appwish Ollama
appwish2
The install script for Linux x64 version includes
Model: codestral:22b-v0.1-q5_0
Java: If Java 19 isn't already installed, it will be installed using apt.
Raspberry Pi 5 installation script includes
Model: codegeex4:latest
Java: If JDK 21 isn't already installed, it will be installed using dpkg. As a result, those unfamiliar with Java no longer have to set up Java 19+ manually.
Comment: We use bellsoft-jdk21.0.4+9-linux-aarch64.deb since Java 19 is not available from apt on the Raspberry OS right now.
Java 19 is the only requirement for running the jar, but you can create code that runs on Java 21 (such as KEM apps).
Always included, whether for the ARM or x64 edition:
- Ollama (if Ollama is already installed but a newer version exists, it will be downloaded for you)
- curl (if not already installed; it's required for installing Ollama)
If you've previously installed AppWish, you'll need to run the installation script again.
Run the installation script for every new AppWish version. Doing so sets the aliases to the correct path, updates Ollama, and ensures the AI model used by AppWish is downloaded to your system.
Details about the Raspberry Pi 5 version:
- The codegeex4:latest model is useful for creating and modifying individual apps, but it is not suitable for generating code bases on this version of AppWish. On a Raspberry Pi 5 (8 GB RAM), it runs without any issues and is actually quite good!
- To prevent heat from building up while running AppWish2 on your device, remove the lid. This is of paramount importance!
If you're unsure how to optimize the temperature of your Raspberry Pi, please refer to the video and post on my LinkedIn.
🐲 🔮 🌌
Commands / Aliases
- appwish2
- change-model
- ollama-config
- appwish-applications
- appwish-version
- appwish-path
Helper script for Windows Subsystem for Linux (WSL)
This script is designed to help you run Appwish Ollama using WSL (wsl_helper_script.sh).
Running this script is not recommended if you have no intention of using WSL.
A message from pwgit-create
Please feel free to contact me at snow_900@outlook.com if you have any questions.
Remember to have a good time and think outside the box!
What's Changed
- Sync by @pwgit-create in #178
- Ollama4j dependency 1.0.79 by @pwgit-create in #179
- Layout fix by @pwgit-create in #180
- release 2.0.4 by @pwgit-create in #181
Full Changelog: v2.0.3...v2.0.4
AppWish 2.0.3
News
- Code base generation has been optimized through various fixes and improvements.
- A GUI fix has been added as well.
Install AppWish Ollama
./install_ollama.sh
Start Appwish Ollama
appwish2
The install script information is still the same (nothing new)
- Java 19 will be installed for you (if you don't already have it), along with Ollama itself. Ollama is installed by invoking their official script, which can be seen at https://ollama.com/download/linux
- The installation script will take care of downloading the AI model used by AppWish for you, making it a breeze 😘
- Don't worry about hard configurations, spyware, or other trash add-ons. This is a clean project that respects integrity more than anything else. I'm sharing this with you to brighten the world and make the sun shine again 🌞
Helper script for Windows Subsystem for Linux (WSL)
If you have a decent Nvidia GPU, use WSL and execute the helper script to activate the WSL GUI. That is the only way to get good speeds unless you run Linux as your primary machine. Let me know if you're having trouble running this on WSL and I'll help you.
🐲 🔮 🌌
Commands / Aliases
- appwish2
- change-model
- ollama-config
- appwish-applications
- appwish-version
- appwish-path
Type them in your terminal to use them.
AppWish 2.0
AppWish 2.0.2
News
Appwish enterprise is now open for all!
AppWish Ollama
- The Continue an app function now places continued apps in a designated folder while keeping the original app in its original location. This is advantageous because you won't lose good Java code just because of an incorrect code edit.
- Put simply, you have backups of your apps when you use the Continue an application button.
Install AppWish Ollama
The Install script will include 6 commands (aliases) for you. 🥇
./install_ollama.sh
Start Appwish Ollama
appwish2
The install script information is still the same (nothing new)
- Java 19 will be installed for you (if you don't already have it), along with Ollama itself. Ollama is installed by invoking their official script, which can be seen at https://ollama.com/download/linux
- The installation script will take care of downloading the AI model used by AppWish for you, making it a breeze 😘
- Don't worry about hard configurations, spyware, or other trash add-ons. This is a clean project that respects integrity more than anything else. I'm sharing this with you to brighten the world and make the sun shine again 🌞
Helper script for Windows Subsystem for Linux (WSL)
If you have a decent Nvidia GPU, use WSL and execute the helper script to activate the WSL GUI. That is the only way to get good speeds unless you run Linux as your primary machine. Let me know if you're having trouble running this on WSL and I'll help you.
🐲 🔮 🌌
Commands / Aliases
- appwish2
- change-model
- ollama-config
- appwish-applications
- appwish-version
- appwish-path
Type them in your terminal to use them.
AppWish going slow? Try this:
- Run the command: ollama-config
- Change NUM_CTX=10240 into NUM_CTX=2048
Keep in mind that if you lower this property too much, code base generation may take a turn for the worse.
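The change can be sketched as follows. This is only an illustration: a temp file stands in for the real config, which you would open with the ollama-config alias.

```shell
# Sketch: lower NUM_CTX to shrink the context window and speed things up.
cfg="$(mktemp)"                 # stand-in for the config opened by ollama-config
printf 'NUM_CTX=10240\n' > "$cfg"

# Replace the old value with the smaller one
sed -i 's/^NUM_CTX=.*/NUM_CTX=2048/' "$cfg"
cat "$cfg"
```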
AppWish 2.0
AppWish 2.0.1
News
AppWish Ollama
- Fix for [Code base Class Question]
Release File / Archive
- Two more aliases added to the release install file
- When a new release version is launched, old AppWish aliases are replaced with new ones.
Install AppWish Ollama
The Install script will include 6 commands (aliases) for you. 🥇
./install_ollama.sh
Start Appwish Ollama
appwish2
Appwish Enterprise
Appwish enterprise is now open for all!
Helper script for Windows Subsystem for Linux (WSL)
If you have a decent Nvidia GPU, use WSL and execute the helper script to activate the WSL GUI. That is the only way to get good speeds unless you run Linux as your primary machine. Let me know if you're having trouble running this on WSL and I'll help you.
🐲 🔮 🌌
Commands / Aliases
- appwish2
- change-model
- ollama-config
- appwish-applications
- appwish-version
- appwish-path
Type them in your terminal to use them.
AppWish 2.0
News
- Code base creation is the new functionality for V 2.0
- :)
Install AppWish Ollama
The Install script will include 4 commands (aliases) for you. 🥇
./install_ollama.sh
Start Appwish Ollama
appwish2
Appwish Enterprise
Appwish enterprise is now open for all!
Helper script for Windows Subsystem for Linux (WSL)
If you have a decent Nvidia GPU, use WSL and execute the helper script to activate the WSL GUI. That is the only way to get good speeds unless you run Linux as your primary machine. Let me know if you're having trouble running this on WSL and I'll help you.
🐲 🔮 🌌
Commands / Aliases
- appwish2
- change-model
- ollama-config
- appwish-applications
Type them in your terminal to use them.
Wish For A Star
Release Notes 1.8.1
- Updated third-party libraries
Install AppWish Ollama
./install_ollama.sh
Start Appwish Ollama
sudo java -jar appwish.jar
Appwish Enterprise
AppWish Enterprise is a version of AppWish that can create code bases from simple user input. To purchase it, contact me and I'll send you a JAR file.
Llama3 is back
The default model has been changed from codestral:22b to Llama3 so that more people can use AppWish. However, the Codestral model is superior and it's the one I use. If you have the hardware for it, use WSL with an Nvidia GPU and you will see.
How can I change the model to codestral:22b for the Linux AMD x64 version?
- Run the install script that installed Ollama, then type
ollama pull codestral:22b
in your terminal.
- Edit the file at this path:
src/main/resources/ollama_model.props
changing
MODEL_NAME=llama3:latest
into
MODEL_NAME=codestral:22b
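The model switch can be sketched in shell. The `ollama pull` is only echoed here (it requires a working Ollama install), and a temp file stands in for src/main/resources/ollama_model.props:

```shell
# Sketch of the model switch: pull the model, then edit the props file.
props="$(mktemp)"               # stand-in for src/main/resources/ollama_model.props
printf 'MODEL_NAME=llama3:latest\n' > "$props"

echo "would run: ollama pull codestral:22b"  # skipped here; requires Ollama

# Point AppWish at the new model
sed -i 's/^MODEL_NAME=.*/MODEL_NAME=codestral:22b/' "$props"
cat "$props"
```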
Helper script for Windows Subsystem for Linux (WSL)
This script is designed to help you run Appwish Ollama using WSL (wsl_helper_script.sh).
If you have a decent Nvidia GPU, you can run Nvidia CUDA with WSL without much setup. If you're looking for very fast app generation, this option is a good choice.
Running this script is not recommended if you have no intention of using WSL.
🐲 🔮 🌌
Wish For A Star
Release Notes 1.8
- Pressing the history button is now disabled during code generation.
- Generated Java app class files can be deleted now.
- Improved the GUI's visual appeal.
Install AppWish Ollama
./install_ollama.sh
Start Appwish Ollama
sudo java -jar appwish.jar
Llama3 is back
The default model has been changed from codestral:22b to Llama3 so that more people can use AppWish. However, the Codestral model is superior and it's the one I use. If you have the hardware for it, use WSL with an Nvidia GPU and you will see.
How can I change the model to codestral:22b for the Linux AMD x64 version?
- Run the install script that installed Ollama, then type
ollama pull codestral:22b
in your terminal.
- Edit the file at this path:
src/main/resources/ollama_model.props
changing
MODEL_NAME=llama3:latest
into
MODEL_NAME=codestral:22b
Helper script for Windows Subsystem for Linux (WSL)
This script is designed to help you run Appwish Ollama using WSL (wsl_helper_script.sh).
If you have a decent Nvidia GPU, you can run Nvidia CUDA with WSL without much setup. If you're looking for very fast app generation, this option is a good choice.
Running this script is not recommended if you have no intention of using WSL.
🐲 🔮 🌌
What's Changed
- synd branches by @pwgit-create in #143
- Opti3 by @pwgit-create in #144
- Develop by @pwgit-create in #145
Full Changelog: v1.7.3...v1.8