User profiles for Tang Jie

Tang Jie

- Verified email at tsinghua.edu.cn - Cited by 38952

Jie Tang

- Verified email at wisc.edu - Cited by 15411

Jie Tang (唐洁)

- Verified email at scut.edu.cn - Cited by 2304

Edge computing for autonomous driving: Opportunities and challenges

S Liu, L Liu, J Tang, B Yu, Y Wang… - Proceedings of the …, 2019 - ieeexplore.ieee.org
Safety is the most important requirement for autonomous vehicles; hence, the ultimate
challenge of designing an edge computing ecosystem for autonomous vehicles is to deliver …

Electrochemical nanobiosensors

M Pumera, S Sanchez, I Ichinose, J Tang - Sensors and Actuators B …, 2007 - Elsevier
This review discusses main techniques and methods which use nanoscale materials for
construction of electrochemical biosensors. Described approaches include nanotube and …

OpenAI Gym

…, L Pettersson, J Schneider, J Schulman, J Tang… - arXiv preprint arXiv …, 2016 - arxiv.org
OpenAI Gym is a toolkit for reinforcement learning research. It includes a growing collection
of benchmark problems that expose a common interface, and a website where people can …

Dota 2 with large scale deep reinforcement learning

…, J Schneider, S Sidor, I Sutskever, J Tang… - arXiv preprint arXiv …, 2019 - arxiv.org
On April 13th, 2019, OpenAI Five became the first AI system to defeat the world champions
at an esports game. The game of Dota 2 presents novel challenges for AI systems such as …

CD24: from A to Z

X Fang, P Zheng, J Tang, Y Liu - Cellular & molecular immunology, 2010 - nature.com
As a testament to the importance of CD24, researchers with diverse interests, including
adaptive immunity, inflammation, autoimmune diseases and cancer, have encountered CD24. …

Evaluating large language models trained on code

…, WH Guss, A Nichol, A Paino, N Tezak, J Tang… - arXiv preprint arXiv …, 2021 - arxiv.org
We introduce Codex, a GPT language model fine-tuned on publicly available code from
GitHub, and study its Python code-writing capabilities. A distinct production version of Codex …

Self-supervised learning: Generative or contrastive

…, L Mian, Z Wang, J Zhang, J Tang - IEEE transactions on …, 2021 - ieeexplore.ieee.org
Deep supervised learning has achieved great success in the last decade. However, its defects
of heavy dependence on manual labels and vulnerability to attacks have driven people to …

Arnetminer: extraction and mining of academic social networks

J Tang, J Zhang, L Yao, J Li, L Zhang, Z Su - Proceedings of the 14th …, 2008 - dl.acm.org
This paper addresses several key issues in the ArnetMiner system, which aims at extracting
and mining academic social networks. Specifically, the system focuses on: 1) Extracting …

GPT understands, too

X Liu, Y Zheng, Z Du, M Ding, Y Qian, Z Yang, J Tang - AI Open, 2023 - Elsevier
Prompting a pretrained language model with natural language patterns has been proved
effective for natural language understanding (NLU). However, our preliminary study reveals …

P-tuning v2: Prompt tuning can be comparable to fine-tuning universally across scales and tasks

…, K Ji, Y Fu, WL Tam, Z Du, Z Yang, J Tang - arXiv preprint arXiv …, 2021 - arxiv.org
Prompt tuning, which only tunes continuous prompts with a frozen language model, substantially
reduces per-task storage and memory usage at training. However, in the context of NLU…