q star info.pdf (uploaded 2024-03-04)
Revealing OpenAI’s plan to create AGI by 2027
In this document I will reveal information I have gathered regarding OpenAI’s (delayed) plans to create
human-level AGI by 2027. Not all of it will be easily verifiable, but hopefully there is enough evidence to
convince you.
Summary: OpenAI started training a 125-trillion-parameter multimodal model in August 2022. The first
stage was Arrakis, also called Q*. The model finished training in December 2023, but the launch was
canceled due to high inference cost. This is the original GPT-5, which was planned for release in 2025. Gobi
(GPT-4.5) has been renamed GPT-5 because the original GPT-5 was canceled.
The next stage of Q*, originally GPT-6 but since renamed GPT-7 (originally slated for release in 2026), has been
put on hold because of the recent lawsuit by Elon Musk.
Q* 2025 (GPT-8) was planned for release in 2027, achieving full AGI.
...
Q* 2023 = 48 IQ
Q* 2024 = 96 IQ (delayed)
Q* 2025 = 145 IQ (delayed)
Elon Musk’s lawsuit caused the delay. That is why I am revealing this information now: no further harm
can be done.
I’ve seen many definitions of AGI – artificial general intelligence – but I will define AGI simply as an
artificial intelligence that can do any intellectual task a smart human can. This is how most people
define the term now.
2020 was the first time I was shocked by an AI system – that was GPT-3. GPT-3.5, an upgraded
version of GPT-3, is the model behind ChatGPT. When ChatGPT was released, I felt as though the
wider world was finally catching up to something I was interacting with 2 years prior. I used GPT-3
extensively in 2020 and was shocked by its ability to reason.
GPT-3, and its half-step successor GPT-3.5 (which powered the now-famous ChatGPT before it
was upgraded to GPT-4 in March 2023), were a massive step towards AGI in a way that earlier
models weren’t. Note that earlier language models like GPT-2 (and basically all chatbots
since ELIZA) had no real ability to respond coherently. So why was GPT-3 such a massive leap?
...
Parameter Count
“Deep learning” is a concept that essentially goes back to the beginning of AI research in the 1950s.
The first neural network was created in the 50s, and modern neural networks are just “deeper”,
meaning, they contain more layers – they’re much, much bigger and trained on lots more data.
Most of the major techniques used in AI today are rooted in basic 1950s research, combined with a
few minor engineering solutions like “backpropagation” and “transformer models”. The overall point is
that AI research hasn’t fundamentally changed in 70 years. So, there are only two real reasons for the
recent explosion of AI capabilities: size and data.
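To make the “backpropagation” mentioned above concrete, here is a minimal sketch of gradient descent on a one-parameter model y = w·x, fitting the toy relation y = 3x. The data, learning rate, and iteration count are illustrative choices, not anything from the document:

```python
# Toy training data: (input, target) pairs sampled from y = 3 * x.
data = [(1.0, 3.0), (2.0, 6.0), (3.0, 9.0)]

w = 0.0    # single learnable parameter (one "synapse" weight)
lr = 0.02  # learning rate (illustrative)

for _ in range(200):
    for x, t in data:
        y = w * x               # forward pass
        grad = 2 * (y - t) * x  # backward pass: d/dw of the loss (y - t)**2
        w -= lr * grad          # gradient descent update

print(round(w, 3))  # converges toward 3.0
```

Real networks apply the same loop, just with millions or trillions of parameters updated at once via the chain rule.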
A growing number of people in the field are beginning to believe we’ve had the technical details of
AGI solved for many decades, but merely didn’t have enough computing power and data to build it
until the 21st century. Obviously, 21st century computers are vastly more powerful than 1950s
computers. And of course, the internet is where all the data came from.
So, what is a parameter? You may already know, but to give a brief digestible summary, it’s analogous
to a synapse in a biological brain, which is a connection between neurons. Each neuron in a
biological brain has roughly 1000 connections to other neurons. Obviously, digital neural networks are
conceptually analogous to biological brains.
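Under the connection analogy above, a model’s parameter count is essentially its number of weighted connections. A small sketch with made-up layer sizes (not from the document) shows how the count is computed for a fully connected network:

```python
# Hypothetical toy network; the layer sizes are illustrative only.
layer_sizes = [784, 512, 512, 10]  # input, two hidden layers, output

# A dense layer from n inputs to m outputs has n * m weights
# (one parameter per connection) plus m bias terms.
params = sum(n * m + m for n, m in zip(layer_sizes, layer_sizes[1:]))
print(params)  # 669706
```

Because the weight term is a product of adjacent layer widths, making layers wider or adding more of them grows the parameter count very quickly, which is what “bigger” means in practice.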
...
So, how many synapses (or “parameters”) are in a human brain?
The most commonly cited figure for synapse count in the brain is roughly 100 trillion, which would
mean each neuron (~100 billion in the human brain) has roughly 1000 connections.
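The figures quoted above are a simple back-of-envelope product, which a two-line check confirms:

```python
# ~100 billion neurons, each with ~1,000 connections,
# gives ~100 trillion synapses.
neurons = 100e9
connections_per_neuron = 1000
synapses = neurons * connections_per_neuron
print(f"{synapses:.0e}")  # 1e+14, i.e. ~100 trillion
```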
[Preview ends here; the remaining 52 pages are not shown. Uploaded by aigozedd.]