Model Training on AMD 16-core CPU with 8GB RAM running in a virtual machine for Bitcoin Price Prediction – Part 2 – Updated

Continuing with Over 500,000 Data Points for Bitcoin (BTC) Price Prediction

Using the Python program, the first method I tried was SVR (Support Vector Regression) for prediction. However… how many time steps should I use for prediction? 🤔
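To make the "steps" question concrete, here is a minimal NumPy sketch (not from the original program — the function name and toy data are mine) of what the step count does: it is the sliding-window length, so each training sample holds that many past prices and the label is the next price. Longer windows mean wider but fewer samples, which is exactly the accuracy/speed trade-off I ran into.

```python
import numpy as np

def make_windows(prices, steps):
    """Turn a 1-D price series into (X, y) pairs:
    each row of X holds `steps` past prices, y is the next price."""
    X = np.array([prices[i:i + steps] for i in range(len(prices) - steps)])
    y = np.array(prices[steps:])
    return X, y

# Toy series standing in for the real minute-by-minute BTC data.
prices = np.arange(100, dtype=float)

X60, y60 = make_windows(prices, steps=60)
print(X60.shape, y60.shape)  # more steps per row -> fewer, wider training samples
```

With 500,000+ rows, a 120-step window means each of ~500,000 samples carries 120 features, which is why the SVR fit time blows up so fast.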

Previously, I used a Raspberry Pi 4B (4GB RAM) for prediction, and… OH… 😩
I don’t even want to count the time again. Just imagine training a new model on a Raspberry Pi!

So, I switched to an AMD 16-core CPU with 8GB RAM running in a virtual machine to perform the prediction.

  • 60-step calculation: took 7 hours 😵
  • 120 steps: …man… still running after 20 hours! 😫 It finally finished after 33 hours!

Do I need an M4 machine for this? 💻⚡

ChatGPT provided another approach.
OK, let’s test it… I’ll let you know how it goes! 🚀

🧪 Quick Example of More Time Steps Effect

Time Step (X Length) | Predicted Accuracy | Notes
30                   | ⭐⭐⭐                | Quick but less accurate for long-term trends.
60                   | ⭐⭐⭐⭐               | Balanced context and performance.
120                  | ⭐⭐⭐⭐½              | Better for long-term trends but slower.
240                  | ⭐⭐                 | Risk of overfitting and slower training.

#SVR #Prediction #Computing #AI #Step #ChatGPT #Python #Bitcoin #crypto #Cryptocurrency #trading #price #virtualmachine #vm #raspberrypi #ram #CPU #CUDA #AMD #Nvidia

Model Training Using TensorFlow on Raspberry Pi 4B (4GB RAM) for Bitcoin Price Prediction

The development of a CRYPTO gaming system https://www.cryptogeemu.com/ has been ongoing for around two years. What does it actually do? Well… just for fun!

The system captures data from several major crypto market sites to fetch the latest price list every minute. It then calculates the average values to determine the price. Users can create a new account and are given a default balance of $10,000 USD to buy and sell crypto—but there’s no actual real-market trading.
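The per-minute averaging step described above can be sketched like this — the quote dictionary and site names are hypothetical placeholders, not the system's real fetch logic:

```python
# Hypothetical sketch of the per-minute averaging step; the site names
# and quote values are made up for illustration.
def average_price(quotes):
    """Average the latest quotes fetched from several market sites."""
    valid = [p for p in quotes.values() if p is not None]  # skip failed fetches
    if not valid:
        raise ValueError("no quotes available this minute")
    return sum(valid) / len(valid)

quotes = {"site_a": 95567.2, "site_b": 95582.9, "site_c": None}
print(average_price(quotes))
```

Skipping sources that failed to respond keeps one flaky site from dragging the average to zero.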

The Thought Process

Suddenly, I started wondering:
How can I use this kind of historical data? Can I make a prediction?

So, I simply asked ChatGPT about my idea. I shared the data structure and inquired about how to perform predictions.

ChatGPT first suggested using Linear Regression for calculations. However, the predicted values had a large difference compared to the next actual data point.

Next, it introduced me to the Long Short-Term Memory (LSTM) method for training under the TensorFlow library.

I fed 514,709 lines of BTC price data into the training program on a Raspberry Pi 4B (4GB RAM).
The first run took 7 hours to complete the model!!!
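To show roughly what TensorFlow's LSTM is grinding through for all those hours, here is one LSTM cell step written in plain NumPy — random weights for illustration, not the trained model:

```python
import numpy as np

rng = np.random.default_rng(0)

def lstm_step(x, h, c, W, U, b):
    """One LSTM time step in plain NumPy: input/forget/output gates plus
    a candidate cell state, following the standard LSTM equations."""
    z = x @ W + h @ U + b                    # all four gate pre-activations at once
    i, f, o, g = np.split(z, 4)
    sigmoid = lambda v: 1.0 / (1.0 + np.exp(-v))
    c_new = sigmoid(f) * c + sigmoid(i) * np.tanh(g)  # update cell memory
    h_new = sigmoid(o) * np.tanh(c_new)               # new hidden state
    return h_new, c_new

units, features = 4, 1
W = rng.normal(size=(features, 4 * units))
U = rng.normal(size=(units, 4 * units))
b = np.zeros(4 * units)
h = c = np.zeros(units)

# Feed a toy window of 60 normalized "prices" through the cell.
for price in np.linspace(0.0, 1.0, 60):
    h, c = lstm_step(np.array([price]), h, c, W, U, b)
print(h.shape)
```

Every window of the 514,709-row dataset repeats this recurrence for each time step, which is why a Pi 4B takes hours per epoch.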

But the result… um… 😐

I’m currently running the second round of training. I’ll update you all soon!

Sample Data:

YYYY/MM/DD-hh:mm:ss  Price  
2025/02/17-20:06:09 95567.20707189501
2025/02/17-20:07:07 95582.896334665
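A small stdlib parser for rows in that format (the function name is mine, the timestamp pattern matches the sample above):

```python
from datetime import datetime

def parse_row(line):
    """Parse one 'YYYY/MM/DD-hh:mm:ss price' row into (datetime, float)."""
    ts_str, price_str = line.split()
    ts = datetime.strptime(ts_str, "%Y/%m/%d-%H:%M:%S")
    return ts, float(price_str)

ts, price = parse_row("2025/02/17-20:06:09 95567.20707189501")
print(ts, price)
```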

P.S.: I’m not great at math. 😅

#BTC #Bitcoin #TensorFlow #AI #CryptoGeemu #RaspberryPi #Training #Crypto #ChatGPT #LinearRegression #LSTM #LongShortTermMemory

AI Network Operator – the Deepseek Case

We all know how successful Deepseek has been in recent months. It demonstrates that a low-processing-power, CPU-based AI is possible. Adopting this type of AI anywhere, including IoT devices or even routers, could be feasible.

Cisco, Juniper, Arista, and other network device manufacturers already produce hardware with high processing power. Some of these devices run Linux- or Unix-based platforms, allowing libraries and packages to be installed on the system. If that’s the case, can AI run on them?

Based on Deepseek’s case, tests have shown that an ARM Linux-based Raspberry Pi can successfully run AI. Although the response time may not meet business requirements, it still functions.

Running AI on a router (perhaps within the control plane?) could enable AI to control and modify router configurations. (Skynet? Terminator?) But then, would the AI become uncontrollable?

There are several key questions to consider:

  1. What can AI do on routers and firewall devices?
  2. Can AI self-learn the network environment and take further control?
  3. Can AI troubleshoot operational issues?

It seems like an interesting topic for further research. Before diving deeper, though, one reassurance: teaching an AI about network operations should no longer be a major hurdle.

Paragraph proofreading by #ChatGPT

AI Picture generated by #CANVA

#AI #Network #internet #networkoperation #operation #IP #Router #RaspberryPI #PI #Cisco #Juniper #Arista #opensource #BGP #routing

Install and Run OLLAMA on Linux Machine

So many tech folks have already shared how to install OLLAMA, so I won't go into too much detail. Just the brief steps for you.

  1. Prepare a machine with a good GPU, a good CPU, and more than 16GB of RAM. (A Raspberry Pi can run Deepseek 1.5B; for others………. please check my last post.)
  2. Update and upgrade your Linux repos:
    sudo apt-get update -y
    sudo apt-get upgrade -y

  3. Install Ollama with the following command:
    curl -fsSL https://ollama.com/install.sh | sh
  4. Run the LLM model. If you don't have the model on your machine, it will be downloaded automatically.
    ollama run <model>
    e.g.: ollama run deepseek-r1:8b

  5. The model will be downloaded to /usr/share/ollama/.ollama/models/
  6. Which models can you run? Check here:
    https://ollama.com/search
  7. The OLLAMA command line is a little similar to Docker's; check it out.

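Besides the CLI, Ollama also serves a local REST API (default port 11434) that you can call from your own scripts. A minimal sketch, assuming an Ollama server is running locally — `build_request` only assembles the call, so run `urlopen` yourself to get an actual completion:

```python
import json
from urllib import request

def build_request(model, prompt, host="http://localhost:11434"):
    """Build a POST request for Ollama's /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return request.Request(
        f"{host}/api/generate",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )

req = build_request("deepseek-r1:8b", "Hello!")
print(req.full_url)
# To actually send it (needs a running Ollama server):
#   with request.urlopen(req) as resp:
#       print(json.load(resp)["response"])
```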
PS: You can also install OLLAMA on Windows; please check the OLLAMA website.

Let's try your own AI locally!

#OLLAMA #Model #AI #CPU #GPU #CUDA #RAM #RaspberryPI #Docker

Raspberry Pi 4B 4GB RAM with Deepseek 8b….

Answer is ……. Fail………..

I cannot run Deepseek 8b on my Raspberry Pi 4B 4GB RAM version…….

OLLAMA keeps loading, crashing, and loading again….. I suspect the storage is full…..

Anyone?

2025-02-06

Hooray!!! Mapped the model directory to a share drive.

ERROR!!!!!!!! Not enough Memory!!!!!!!!!!

#AI #deepseek #raspberrypi #R1-8B #memory #ram #ERROR

Deepseek 1.5b vs 8b version

Well, we all expect the 1.5b and 8b versions to differ in knowledge.

We ran a test:
1. 1.5b on a Raspberry Pi 4B with 4GB RAM.
2. 8b on a virtual machine with an AMD Radeon GPU and 16GB RAM on Ubuntu.

We asked only one question:

“what is the difference between you and chatGPT”

  • 1. 1.5b version

  • 2. 8b version

Of course, the 8b model's knowledge base is better. However, our main concern is resource usage: can a CPU-based Raspberry Pi process this efficiently?

#deepseek #AI #CPU #raspberrypi #GPU #nvidia #CUDA #AMD