/g/ - Technology

Board dedicated to discussions related to everyday Technology!

New Boards Added

>>>/int/ >>>/out/ >>>/wp/ >>>/ck/ >>>/k/ >>>/fa/

Bharatchan September 2025 Update: Hello everyone, it's been a while. Here we are wit...

BharatChan Disclaimer

Notice

Before proceeding, please read and understand the following:

1. BharatChan is a user-generated content platform. The site owners do not claim responsibility for posts made by users.

2. By accessing this website, you acknowledge that content may not be suitable for all audiences.

3. You must follow BharatChan’s community guidelines and rules. Failure to do so may result in a ban.

4. By using BharatChan, you agree to the use of cookies, which are used mostly for user sessions.

A poster on BharatChan must abide by the following rules:

Sitewide Rules
You must be 18 or older to post.
Sharing personal details or engaging in doxing is strictly prohibited.
Political discussions should be confined to /pol/.
Off-topic discussions, thread derailment, or spam may result in a ban and IP blacklist.
Pornographic content is strictly prohibited.
Any activity violating local laws is not allowed.
If you are not Indian, you may only post in /int/, or you can create an account and ask for approval to post on other boards.

Recent Posts

Gemini+Figma for UI

/emacs/ general

Simple Linux General /slg/ - Useful Commands editi...

Jio Fiber

LEARNING SPREADSHEETS

My Biggest Project (till date)

AI image gen

Zerodha just donated 100,000 usd to FFMPEG

ITS HAPPENING

CAN SOMEONE EXPLAIN ME HOW THESE JOB DESCRIPTIONS ...

the best android browser has arrived!!

/compiler_develoment/ thread

Pajeet doval

/g/ - Laptops

Esp32 Jammer

Help me move away from winblows

दोमेन (Domain)

the hmd touch 4g

I am done. It's over.

Jokes write themselves

AWS outage

took them long enough

just installed Arch Linux bros

Where to apply

Is the battery of my laptop dead?

OpenCL

Where are we heading towards?

does this ever end?

Zoho appreciation thread

new discord server for pair-programming

Looking to Buy a Decent Gaming Laptop

which llm subscription is the best

RSS feed for Bharatchan ?

Sketch - A simple 2D graphics library in C

AI = Actually Indonesians

Anonymous (IN) NBVf7m No.709

It was pinoys, but same thing tbh.

The CEO is getting charged with defrauding his investors by using humans while claiming it was AI.

Anonymous (IN) AFZxCh No.710

reply to my thread about running llms locally.

Anonymous (IN) NBVf7m No.711

>>710

you are not gonna be able to run a 67-billion-parameter model on any laptop (i am not sure about macbooks).

Get an RTX 3060 or 4060 with the most VRAM you can manage and then maybe you have a chance.

However, smaller 7-billion-parameter models can in theory be run.

I don't do it myself; another anon created a thread about it, so find that thread and ask him desu.

On that note, good idea, we should probably start a general dedicated to local models, what do you say sirs?

Start by grabbing bits of info from 4chan /g/ and LocalLLaMA.

On /g/ there's /aicg/ and /lmg/.
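A quick back-of-envelope way to check the sizes anons are throwing around: the weights alone take roughly (parameters × bits per weight ÷ 8) bytes, which is why a 67B model is hopeless on a laptop while a 4-bit-quantized 7B model fits in a 4-8 GB card. A minimal sketch (weights only; real usage also needs room for the KV cache and runtime overhead):

```python
# Back-of-envelope VRAM needed just to hold a model's weights at
# common quantization levels (ignores KV cache and activations).
def weight_gb(params_billion: float, bits_per_weight: float) -> float:
    bytes_total = params_billion * 1e9 * bits_per_weight / 8
    return bytes_total / 1024**3  # GiB

for params in (7, 67):
    for label, bits in (("fp16", 16), ("q8", 8), ("q4", 4)):
        print(f"{params}B @ {label}: ~{weight_gb(params, bits):.1f} GiB")
```

So a 7B model at fp16 already needs ~13 GiB for weights, but quantized to 4 bits it drops to ~3.3 GiB, while a 67B model needs well over 100 GiB even before overhead.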

Duck (IN) 7zK+rK No.712

>>710

Just run them in Google Colab. It won't work on a laptop.

Anonymous (IN) AFZxCh No.713

>>711
>>712

yes, yaar, i think some MLfag here should make a thread dedicated to ML, with resources and discussion of new models and research papers.

Anonymous (IN) AFZxCh No.714

>>712

I want to actually use it for daily tasks.

https://vicharak.in/axon

I found this on twitter saying that it can run models. if any anon knows, kindly explain what this thing is.

Anonymous (IN) NBVf7m No.715

>>714
>https://vicharak.in/axon

I have been interested in them for a while, maybe i will get one of their products in the future.

I think starting at 6k or something is not bad.

Duck (IN) 7zK+rK No.716

>>714
>18k rupees
>8gb DDR4

It won't run anything your laptop can't already run.

Duck (IN) 7zK+rK No.717

Seems more like a buffed-up Raspberry Pi than a device for running AI models.

Anonymous (IN) NBVf7m No.718

>>716

Not sure about the price, but they have one product which is like a Raspberry Pi, and it's not their only one.

Anonymous (IN) NBVf7m No.719

>>717

yeah

Anonymous (IN) NBVf7m No.720

if you have an rtx 3050, a decent chunk of ram (i have like 64gb) and a decent processor, you can run many llms with around 7bn parameters or so.

maybe some image generation models like stable diffusion etc.
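A hedged rule of thumb for what "running" actually feels like on such hardware: single-stream decoding is roughly memory-bandwidth-bound, since generating each token streams the full weights through memory once, so tokens/sec is capped at about bandwidth divided by weight bytes. The ~224 GB/s figure below is an approximate number for an RTX 3050-class card, used only for illustration:

```python
# Rough ceiling on single-stream decode speed: every generated token
# has to stream all the weights through memory once, so
#   tokens/sec <= memory_bandwidth / weight_bytes
def max_tokens_per_sec(bandwidth_gb_s: float, params_billion: float,
                       bits_per_weight: float) -> float:
    weight_gb = params_billion * bits_per_weight / 8  # weight size in GB
    return bandwidth_gb_s / weight_gb

# ~224 GB/s is an approximate RTX 3050-class bandwidth (assumption).
print(max_tokens_per_sec(224, 7, 4))  # 7B model, 4-bit quantized: ~64 tok/s ceiling
```

Real throughput lands well below this ceiling, but it explains why a 4-bit 7B model is comfortably usable on such a card while anything much bigger crawls or spills out of VRAM.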

Anonymous (IN) AFZxCh No.721

>>720

well, i have 16GB DDR4, an i5 11th gen and an rtx 2050 with 4GB VRAM. what can i even do with this? also i think my gpu will work better for anything like this.

Anonymous (IN) AFZxCh No.722

>>720
>Simpler or less detailed responses: While a 10B model can handle many tasks very well, it may sometimes lack the richer, more layered answers that come from a model with hundreds of billions of parameters.
>Reduced performance on very complex or highly technical queries: The distilled model might occasionally struggle with the most challenging reasoning tasks compared to its larger counterpart.

what do you think about this, or what's your experience with it?

Anonymous (IN) NBVf7m No.723

>>721

try 7bn-parameter models - mistral, deepseek, meta all probably have those.

iirc deepseek has a 1bn-parameter model but it's mostly useless.

Anonymous (IN) NBVf7m No.724

>>722
>what do you think about this or your experience on this?

Give me this week; i have a decent setup, i can run these models.

I will try some llms and some image gen ones, and if it works out i will write about it. Probably kickstart a general.

Anonymous (IN) AFZxCh No.725

>>723

alright, should i do it on ubuntu or windows? i have 40 GB left on ubuntu, that's why i'm asking.

Anonymous (IN) NBVf7m No.726

>>725

loonix, cuda is better optimized and more up to date on it

Anonymous (IN) NBVf7m No.727

I have a separate, slightly older setup; if i can run 7bn models on it then ig it will be a breeze for you.

I will test there first. It has the same limitations as yours, like 16gb ram etc., and a slightly worse graphics card than yours.
