Google Open-Sources Gemma-3: Performance Comparable to DeepSeek at a Fraction of the Compute
Jinshi Data, March 13 — Last night, Google (GOOG.O) CEO Sundar Pichai announced the open-source release of Gemma-3, the company's latest multimodal large model, billed as low-cost and high-performance. Gemma-3 comes in four parameter sizes: 1 billion, 4 billion, 12 billion, and 27 billion. Even the largest 27-billion-parameter version needs only a single H100 GPU for efficient inference — at least 10x more compute-efficient than comparable models achieving similar results — making it, by that measure, the most capable small-parameter model currently available. According to blind-test data from the LMSYS Chatbot Arena, Gemma-3 ranks second only to DeepSeek's R1-671B, ahead of OpenAI's o3-mini, Llama3-405B, and other well-known models.