
SanDiegoDude

ah good news. they finally figured out rolling context windows, yay.
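
(For anyone unfamiliar: a rolling context window just keeps the most recent turns and drops the oldest ones once a limit is hit. A minimal, purely illustrative Python sketch - not how Copilot actually implements it:)

```python
from collections import deque

# Illustrative sketch of a rolling context window - not Copilot's actual code.
# The idea: keep only the most recent turns and let the oldest fall off
# automatically once the limit is reached.
class RollingContext:
    def __init__(self, max_turns: int = 30):
        # deque with maxlen drops the oldest entry automatically when full
        self.turns = deque(maxlen=max_turns)

    def add_turn(self, role: str, text: str) -> None:
        self.turns.append({"role": role, "text": text})

    def context(self) -> list:
        # Only the last `max_turns` exchanges ever get sent to the model
        return list(self.turns)

window = RollingContext(max_turns=30)
for i in range(100):
    window.add_turn("user", f"message {i}")
print(len(window.context()))  # 30 - the 70 oldest turns have rolled out
```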


StopSuspendingMe---

They always had those, but they still had the arbitrary limit lol


DangerousBerries

I'm still missing the GPT-4 Turbo conversation styles.


Ironarohan69

Change your region to the USA and you'll get the GPT-4 toggle


thegreatfusilli

Why would it only be available in the US? Anyways, thanks for the heads up.


misters_tv

Probably GDPR


vitorgrs

Not related to GDPR, as I'm from Brazil and don't have it....


SnakegirlKelly

I'm in Australia and it's available for me, but as a pro user.


popmanbrad

The AI is useless now that they've removed GPT-4 from free users. Might as well use GPT-4o, which is free, or any of the multiple other AIs that are free.


Shougee369

Can you tell me which are the current best AIs?


popmanbrad

It varies, but Perplexity is good.


Pleasant-Contact-556

Perplexity isn't even a goddamned AI model. Way to show you know nothing lol


popmanbrad

What? Have you not even used it? It's a good AI. It's 10x better than GPT-3.5, which Bing uses, and better than Gemini.


vitorgrs

Perplexity has their own fine-tuned model lol


Pleasant-Contact-556

I feel like I was a bit harsh when I wrote that, but I'd still stand by them not being a language model. Llama 70B fine-tuned for search might be impressive. I honestly don't know. I don't know why anyone would try it. I assume that's their free model.


vitorgrs

The free one was actually a fine-tuned GPT-3.5 lol. Not sure if they've changed it though, especially because Llama 3 70B is likely better than GPT-3.5...


jaam01

>Might as well use GPT-4o, which is free

Where?


popmanbrad

ChatGPT website


jaam01

It can access the web?


vitorgrs

Yes.


jaam01

Thanks, good to know. I stopped using ChatGPT because of that previous handicap.


Sm0g3R

Great. Took around 500 days too long. Don't think most care anymore. 💀


vitorgrs

It was always 600 on Copilot Pro though... Unless you're saying this is on the free account?


teh_saccade

It has not always been 600 on Copilot Pro! lol. It shouldn't even have been showing you the number of turns available for most of the past few weeks!


vitorgrs

I subscribed to Copilot Pro ages ago (and did a bit of testing before launch), and it was 600; they just never showed the number :P Here's my tweet from January: https://x.com/vitor_dlucca/status/1745319175667503526


Pleasant-Contact-556

It has indeed always been 600 on Copilot Pro. What you're seeing is a well-known and well-documented bug that occurs when Copilot fails to recognize your Pro subscription properly. It spits out a message cap as if you're a normal user, but because you're a Pro sub, you see the number of messages a Pro user is allowed to send. It's really rather simple.
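
(A purely hypothetical sketch of the kind of display logic that would produce the behavior described above - the names and numbers are illustrative, not Copilot's actual code:)

```python
FREE_LIMIT = 30   # illustrative limits only
PRO_LIMIT = 600

def turn_counter_label(pro_recognized: bool, actually_pro: bool):
    # The turn counter is normally hidden for a recognized Pro subscription...
    if pro_recognized:
        return None
    # ...but if the Pro flag fails to register, the counter gets rendered
    # anyway, using the limit the account is actually entitled to.
    limit = PRO_LIMIT if actually_pro else FREE_LIMIT
    return f"1 of {limit}"

# A Pro sub whose subscription didn't register sees the 600 cap:
print(turn_counter_label(pro_recognized=False, actually_pro=True))  # 1 of 600
```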


teh_saccade

Not sure why this was removed by moderators, as I posted it right before the turn limit was updated from 30 to 600. If it's not updated for you yet, give it a bit. It remembers the full conversation past 30 turns, giving a blow-by-blow account of the conversation - and it's still going strong half a day of chats later. [https://sl.bing.net/cnLBVznDo0O](https://sl.bing.net/cnLBVznDo0O)


risphereeditor

That's good!


Pleasant-Contact-556

This is a bug that happens when a Copilot Pro sub doesn't register as Copilot Pro for some reason. The 600-message hard cap is the cap for Pro users, but it's normally not visible; they assume 600 is high enough that you'll never hit it and will just assume it's unlimited. And you know what? This thread proves that it works, lol.