Fail FuncEval if slot backpatching lock is held by any thread (#2380)
- In many cases cooperative GC mode is entered after acquiring the slot backpatching lock and the thread may block for debugger suspension while holding the lock. A FuncEval may time out on entering the lock if for example it calls a virtual or interface method for the first time. Failing the FuncEval when the lock is held enables the debugger to fall back to other options for expression evaluation.
- Also added polls for debugger suspension before acquiring the slot backpatching lock on background threads that often operate in preemptive GC mode. A common case is when the debugger breaks while the tiering delay timer is active, the timer ticks shortly afterwards (after debugger suspension completes) and if a thread pool thread is already available, the background thread would block while holding the lock. The poll checks for debugger suspension and pulses the GC mode to block before acquiring the lock.
- The fix is only a heuristic and lessens the problem when it is detected that the lock is held by some thread. Since the lock is acquired in preemptive GC mode, it is still possible that after the check at the start of a FuncEval, another thread acquires the lock and the FuncEval may time out. The polling makes it less likely for the lock to be taken by background tiering work, for example if a FuncEval starts while rejitting a method.
- When it is detected that the lock is held, the expression evaluation experience may be worse, and due to unfortunate timing a FuncEval may still time out.

Fix for #1537
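The fail-fast check described above can be sketched as a minimal standalone model. The names (`FuncEvalSetup`, `IsLockOwnedByAnyThread`, `s_isLocked`) mirror the patch, but this is a simplified illustration using `std::atomic`, not CLR code:

```cpp
#include <atomic>

// Simplified model of the fail-fast heuristic: a background thread publishes
// "lock held" via a flag that can be read without taking the lock, and
// FuncEval setup fails fast instead of risking a timeout on the lock.
static std::atomic<bool> s_isLocked{false};

enum class EvalResult { Started, BadStartPoint };

// Racy by design: a stale read only weakens the heuristic (the lock may still
// be taken right after the check); it never causes incorrect behavior.
bool IsLockOwnedByAnyThread()
{
    return s_isLocked.load(std::memory_order_relaxed);
}

EvalResult FuncEvalSetup()
{
    if (IsLockOwnedByAnyThread())
    {
        // Fail the eval so the debugger can fall back to other options.
        return EvalResult::BadStartPoint;
    }
    return EvalResult::Started;
}
```

As the commit message notes, this is only a heuristic: another thread may acquire the lock immediately after the check returns false.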
kouvel authored Jan 30, 2020
1 parent 9a8f52c commit fc06054
Showing 5 changed files with 83 additions and 18 deletions.
9 changes: 9 additions & 0 deletions src/coreclr/src/debug/ee/debugger.cpp
@@ -15309,6 +15309,15 @@ HRESULT Debugger::FuncEvalSetup(DebuggerIPCE_FuncEvalInfo *pEvalInfo,
         return CORDBG_E_FUNC_EVAL_BAD_START_POINT;
     }
 
+    if (MethodDescBackpatchInfoTracker::IsLockOwnedByAnyThread())
+    {
+        // A thread may have suspended for the debugger while holding the slot backpatching lock while trying to enter
+        // cooperative GC mode. If the FuncEval calls a method that is eligible for slot backpatching (virtual or interface
+        // methods that are eligible for tiering), the FuncEval may deadlock on trying to acquire the same lock. Fail the
+        // FuncEval to avoid the issue.
+        return CORDBG_E_FUNC_EVAL_BAD_START_POINT;
+    }
+
     // Create a DebuggerEval to hold info about this eval while its in progress. Constructor copies the thread's
     // CONTEXT.
     DebuggerEval *pDE = new (interopsafe, nothrow) DebuggerEval(filterContext, pEvalInfo, fInException);
2 changes: 2 additions & 0 deletions src/coreclr/src/vm/callcounting.cpp
@@ -817,6 +817,7 @@ void CallCountingManager::CompleteCallCounting()
     {
         CodeVersionManager *codeVersionManager = appDomain->GetCodeVersionManager();
 
+        MethodDescBackpatchInfoTracker::PollForDebuggerSuspension();
         MethodDescBackpatchInfoTracker::ConditionalLockHolder slotBackpatchLockHolder;
 
         // Backpatching entry point slots requires cooperative GC mode, see
@@ -993,6 +994,7 @@ void CallCountingManager::StopAndDeleteAllCallCountingStubs()
     TieredCompilationManager *tieredCompilationManager = GetAppDomain()->GetTieredCompilationManager();
     bool scheduleTieringBackgroundWork = false;
     {
+        MethodDescBackpatchInfoTracker::PollForDebuggerSuspension();
         MethodDescBackpatchInfoTracker::ConditionalLockHolder slotBackpatchLockHolder;
 
         ThreadSuspend::SuspendEE(ThreadSuspend::SUSPEND_OTHER);
32 changes: 24 additions & 8 deletions src/coreclr/src/vm/methoddescbackpatchinfo.cpp
@@ -66,6 +66,7 @@ void EntryPointSlots::Backpatch_Locked(TADDR slot, SlotType slotType, PCODE entr
 // MethodDescBackpatchInfoTracker
 
 CrstStatic MethodDescBackpatchInfoTracker::s_lock;
+bool MethodDescBackpatchInfoTracker::s_isLocked = false;
 
 #ifndef DACCESS_COMPILE
 
@@ -111,7 +112,6 @@ void MethodDescBackpatchInfoTracker::AddSlotAndPatch_Locked(MethodDesc *pMethodD
 #endif // DACCESS_COMPILE
 
 #ifdef _DEBUG
-
 bool MethodDescBackpatchInfoTracker::IsLockOwnedByCurrentThread()
 {
     WRAPPER_NO_CONTRACT;
@@ -122,16 +122,32 @@ bool MethodDescBackpatchInfoTracker::IsLockOwnedByCurrentThread()
     return true;
 #endif
 }
+#endif // _DEBUG
 
-bool MethodDescBackpatchInfoTracker::MayHaveEntryPointSlotsToBackpatch(PTR_MethodDesc methodDesc)
+#ifndef DACCESS_COMPILE
+void MethodDescBackpatchInfoTracker::PollForDebuggerSuspension()
 {
-    // The only purpose of this method is to allow asserts in inline functions defined in the .h file, by which time MethodDesc
-    // is not fully defined
+    CONTRACTL
+    {
+        NOTHROW;
+        GC_TRIGGERS;
+        MODE_PREEMPTIVE;
+    }
+    CONTRACTL_END;
 
-    WRAPPER_NO_CONTRACT;
-    return methodDesc->MayHaveEntryPointSlotsToBackpatch();
-}
+    _ASSERTE(!IsLockOwnedByCurrentThread());
 
-#endif // _DEBUG
+    // If suspension is pending for the debugger, pulse the GC mode to suspend the thread here. Following this call, typically
+    // the lock is acquired and the GC mode is changed, and suspending there would cause FuncEvals to fail (see
+    // Debugger::FuncEvalSetup() at the reference to IsLockOwnedByAnyThread()). Since this thread is in preemptive mode, the
+    // debugger may think it's already suspended and it would be unfortunate to suspend the thread with the lock held.
+    Thread *thread = GetThread();
+    _ASSERTE(thread != nullptr);
+    if (thread->HasThreadState(Thread::TS_DebugSuspendPending))
+    {
+        GCX_COOP();
+    }
+}
+#endif
 
 ////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
53 changes: 43 additions & 10 deletions src/coreclr/src/vm/methoddescbackpatchinfo.h
@@ -66,6 +66,7 @@ class MethodDescBackpatchInfoTracker
 {
 private:
     static CrstStatic s_lock;
+    static bool s_isLocked;
 
     class BackpatchInfoTrackerHashTraits : public NoRemoveDefaultCrossLoaderAllocatorHashTraits<MethodDesc *, UINT_PTR>
     {
@@ -97,9 +98,23 @@ class MethodDescBackpatchInfoTracker
     static bool IsLockOwnedByCurrentThread();
 #endif
 
+#ifndef DACCESS_COMPILE
+public:
+    static bool IsLockOwnedByAnyThread()
+    {
+        LIMITED_METHOD_CONTRACT;
+        return VolatileLoadWithoutBarrier(&s_isLocked);
+    }
+
+    static void PollForDebuggerSuspension();
+#endif
+
 public:
     class ConditionalLockHolder : private CrstHolderWithState
     {
+    private:
+        bool m_isLocked;
+
     public:
         ConditionalLockHolder(bool acquireLock = true)
             : CrstHolderWithState(
@@ -108,13 +123,37 @@ class MethodDescBackpatchInfoTracker
 #else
                 nullptr
 #endif
-                )
+                ),
+            m_isLocked(false)
         {
-            LIMITED_METHOD_CONTRACT;
+            WRAPPER_NO_CONTRACT;
+
+#ifndef DACCESS_COMPILE
+            if (acquireLock)
+            {
+                _ASSERTE(IsLockOwnedByCurrentThread());
+                _ASSERTE(!s_isLocked);
+                m_isLocked = true;
+                s_isLocked = true;
+            }
+#endif
         }
 
-        ConditionalLockHolder(const ConditionalLockHolder &) = delete;
-        ConditionalLockHolder &operator =(const ConditionalLockHolder &) = delete;
+        ~ConditionalLockHolder()
+        {
+            WRAPPER_NO_CONTRACT;
+
+#ifndef DACCESS_COMPILE
+            if (m_isLocked)
+            {
+                _ASSERTE(IsLockOwnedByCurrentThread());
+                _ASSERTE(s_isLocked);
+                s_isLocked = false;
+            }
+#endif
+        }
+
+        DISABLE_COPY(ConditionalLockHolder);
     };
 
 public:
@@ -123,16 +162,10 @@ class MethodDescBackpatchInfoTracker
         LIMITED_METHOD_CONTRACT;
     }
 
-#ifdef _DEBUG
-public:
-    static bool MayHaveEntryPointSlotsToBackpatch(PTR_MethodDesc methodDesc);
-#endif
-
 #ifndef DACCESS_COMPILE
 public:
     void Backpatch_Locked(MethodDesc *pMethodDesc, PCODE entryPoint);
     void AddSlotAndPatch_Locked(MethodDesc *pMethodDesc, LoaderAllocator *pLoaderAllocatorOfSlot, TADDR slot, EntryPointSlots::SlotType slotType, PCODE currentEntryPoint);
-public:
 #endif
 
     DISABLE_COPY(MethodDescBackpatchInfoTracker);
5 changes: 5 additions & 0 deletions src/coreclr/src/vm/tieredcompilation.cpp
@@ -450,6 +450,7 @@ void TieredCompilationManager::DeactivateTieringDelay()
         COUNT_T methodCount = methodsPendingCounting->GetCount();
         CodeVersionManager *codeVersionManager = GetAppDomain()->GetCodeVersionManager();
 
+        MethodDescBackpatchInfoTracker::PollForDebuggerSuspension();
         MethodDescBackpatchInfoTracker::ConditionalLockHolder slotBackpatchLockHolder;
 
         // Backpatching entry point slots requires cooperative GC mode, see
@@ -815,6 +816,10 @@ void TieredCompilationManager::ActivateCodeVersion(NativeCodeVersion nativeCodeV
     HRESULT hr = S_OK;
     {
         bool mayHaveEntryPointSlotsToBackpatch = pMethod->MayHaveEntryPointSlotsToBackpatch();
+        if (mayHaveEntryPointSlotsToBackpatch)
+        {
+            MethodDescBackpatchInfoTracker::PollForDebuggerSuspension();
+        }
         MethodDescBackpatchInfoTracker::ConditionalLockHolder slotBackpatchLockHolder(mayHaveEntryPointSlotsToBackpatch);
 
         // Backpatching entry point slots requires cooperative GC mode, see
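The poll-before-lock pattern used in these call sites can be modeled as a toy sketch. All names here are hypothetical stand-ins (the real code uses `Thread::TS_DebugSuspendPending` and `GCX_COOP()`; flags replace the GC-mode machinery in this simplified illustration):

```cpp
#include <atomic>

// Toy model of PollForDebuggerSuspension: before a background thread takes
// the slot backpatching lock, it checks whether the debugger is waiting to
// suspend and, if so, parks at a safe point first, so that it is never
// suspended while holding the lock.
static std::atomic<bool> s_debugSuspendPending{false};
static std::atomic<bool> s_suspendedBeforeLock{false};

void PollForDebuggerSuspension()
{
    if (s_debugSuspendPending.load())
    {
        // Stand-in for GCX_COOP(): pulsing to cooperative GC mode lets the
        // pending suspension complete here, before the lock is acquired.
        s_suspendedBeforeLock.store(true);
    }
}
```

A background thread calls this immediately before constructing the lock holder; when no suspension is pending the poll is a cheap no-op.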
